Thursday, November 20, 2008

Update: Also explained the concept and put up the code for the left navigation provider here.

Update: Added the Web.config entries at the bottom.

By popular demand, here is the write-up for the Custom Navigation Provider for SharePoint 2007 I wrote last year. Be sure to check it out and send me feedback.

So here is the use case. You would like a consistent navigation hierarchy in your SharePoint environment. The OOB navigation is not going to work for you because your site has probably grown to many site collections, and consistent navigation across them is a need. You do not want to change the navigation on every site collection whenever it needs to change. And the appropriate users should be able to change the top navigation as needed without having full access to the site.

I was faced with these challenges last year, so I came up with the idea of writing a custom navigation provider that reads from a list. The list can have folder hierarchies, and those determine the levels and the dropdowns.

The list images and the changes that need to be made in the master page and web.config file are shown below for this to work.

We created custom site columns, custom content types and then a custom list that used these content types to allow users to easily build hierarchies that the navigation provider could read and deduce the navigation levels from. Here is an example of the custom list for the top navigation content. The actual URLs below in the Url Link column have been erased, but this should get the point across.

This is a view of the global navigation shared across all site collections. It includes one level of dropdowns, but more can be added by adding to the list hierarchies and also tweaking the levels to show in the AspMenu in the SharePoint master page. Here are the changes that had to be made in the master page.
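As a sketch of what those master page changes amount to (the control IDs and the provider name CustomTopNavProvider below are assumptions; match the provider name to whatever you register in web.config), the menu is simply bound to the custom provider through a SiteMapDataSource:

```xml
<asp:SiteMapDataSource
    runat="server"
    id="topSiteMap"
    SiteMapProvider="CustomTopNavProvider"
    ShowStartingNode="false" />

<SharePoint:AspMenu
    runat="server"
    id="TopNavigationMenu"
    DataSourceID="topSiteMap"
    Orientation="Horizontal"
    StaticDisplayLevels="1"
    MaximumDynamicDisplayLevels="1" />
```

MaximumDynamicDisplayLevels is the knob mentioned above: raising it surfaces deeper list folder levels as additional flyout levels.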

namespace CompanyXX.MOSS.Utilities.Navigation.Providers
{
    //Assign the necessary security permissions. TODO - Check the permissions required.
    [AspNetHostingPermissionAttribute(SecurityAction.LinkDemand, Level = AspNetHostingPermissionLevel.Minimal)]
    [SharePointPermissionAttribute(SecurityAction.LinkDemand, ObjectModel = true)]
    [AspNetHostingPermissionAttribute(SecurityAction.InheritanceDemand, Level = AspNetHostingPermissionLevel.Minimal)]
    [SharePointPermissionAttribute(SecurityAction.InheritanceDemand, ObjectModel = true)]
    //This inherits from the PortalSiteMapProvider class in MOSS, just because it provides some of the functions I need.
    //You could just as easily write one for WSS.
    public class CustomTopNavProvider : PortalSiteMapProvider
    {
        //Create the in-memory objects for storage and fast retrieval
        protected SiteMapNodeCollection siteMapNodeColl;

        //Child-parent relationships, read later by ComposeNodes
        protected ArrayList childParentRelationship;

        //These are only the top level nodes that will show in the top nav
        protected ArrayList topLevelNodes;

        /// <summary>
        /// Load the top navigation into memory on the first call.
        /// </summary>
        protected virtual void LoadTopNavigationFromList()
        {
            //Make sure to build the structure in memory only once
            lock (this)
            {
                if (rootNode != null)
                {
                    return;
                }
                else
                {
                    //Initialize for the first time
                    SPSite rootSite = null;
                    SPWeb rootWeb = null;
                    SPList topnavList = null;

                    try
                    {
                        //Clear the top level nodes and the relationships
                        topLevelNodes.Clear();
                        childParentRelationship.Clear();

                        //Instantiate sites and lists for now. This setting assumes that the list being
                        //read from for the global top navigation is in the root web of the site collection listed in web.config.
                        rootSite = new SPSite(ConfigurationManager.AppSettings["CompanyXXRootSite"]);
                        rootWeb = rootSite.RootWeb;
                        topnavList = rootWeb.Lists[ConfigurationManager.AppSettings["TopNavigationListName"]];

                        //Build the root node
                        //Note: Any top level site of any site collection is assigned to be the rootNode here, not necessarily the
                        //top level site of the main site collection
                        rootNode = (PortalSiteMapNode)this.RootNode;

                        //We need to pass the PortalSiteMapNode constructor a PortalWebSiteMapNode object, so here it is
                        //Note: This is the root node of 1 site collection, but the navigation will be shown in all site collections.
                        PortalWebSiteMapNode pwsmn = rootNode as PortalWebSiteMapNode;

                        if (pwsmn != null)
                        {
                            //Get the current folder to start. The navigation hierarchy can start at that folder.
                            SPFolder currentFolder = topnavList.RootFolder.SubFolders[ConfigurationManager.AppSettings["NavigationListStartFolderName"]];

                            //Build the node hierarchy starting at that folder
                            BuildListNodes(rootWeb, currentFolder, pwsmn, rootNode, true);
                        }
                    }
                    catch (Exception ex)
                    {
                        //There was a problem opening the site or the list.
                        ExceptionManager.Publish(ex);
                    }
                    finally
                    {
                        //Dispose of the objects
                        if (rootWeb != null)
                            rootWeb.Dispose();

                        if (rootSite != null)
                            rootSite.Dispose();
                    }
                }
            }
        }

        /// <summary>
        /// Go through the list and build and save the PortalSiteMapNode nodes into memory based on the list hierarchy.
        /// </summary>
        /// <param name="currentWeb">the web containing the navigation list</param>
        /// <param name="folder">this is the current folder to look for items</param>
        /// <param name="prtlWebSiteMapNode">the parent PortalWeb</param>
        /// <param name="parentSiteMapNode">the parent node</param>
        /// <param name="rootLevel">true if this is the first level, false if it is a subnode</param>
        protected virtual void BuildListNodes(SPWeb currentWeb, SPFolder folder, PortalWebSiteMapNode prtlWebSiteMapNode, PortalSiteMapNode parentSiteMapNode, bool rootLevel)
        {
            //Get the collection of items from this folder
            SPQuery qry = new SPQuery();
            qry.Folder = folder;
            SortedList orderedNodes = new SortedList();
            int counter = 100; //for sorting items

            try
            {
                //Browse through the items in the folder and create PortalSiteMapNodes
                SPListItemCollection ic = currentWeb.Lists[folder.ParentListId].GetItems(qry);
                foreach (SPListItem subitem in ic)
                {
                    //A SiteMapNode does not have target or audience information
                    //SiteMapNode smn = new SiteMapNode(this, subitem.ID.ToString(), subitem.GetFormattedValue("UrlText"), subitem.Title, subitem.GetFormattedValue("UrlText"));

                    //Change the nodeTypes to AuthoredLink for leaf nodes so that the GetChildNodes method is not called for those nodes.
                    NodeTypes ntypes = NodeTypes.AuthoredLink;
                    if (subitem.Folder != null)
                        ntypes = NodeTypes.Default;

                    //Create the PortalSiteMapNode (psmn) for this item here, from prtlWebSiteMapNode, ntypes,
                    //and the item's title and URL (the constructor call is not shown in this excerpt)

                    //Order the nodes
                    try
                    {
                        int order = Convert.ToInt32(subitem.GetFormattedValue(ConfigurationManager.AppSettings["ItemOrder"]));
                        orderedNodes.Add(order, psmn);
                    }
                    catch (Exception ex)
                    {
                        //This will happen if 2 items are assigned the same order. Push one item to the last order.
                        orderedNodes.Add(counter++, psmn);
                    }

                    //If this is a folder, fetch and build the hierarchy under this folder
                    if (subitem.Folder != null)
                        BuildListNodes(currentWeb, subitem.Folder, prtlWebSiteMapNode, psmn, false);
                }

                //Copy nodes in the right order
                foreach (object portalSiteMapNode in orderedNodes.Values)
                {
                    //Add the node to the different collections
                    if (rootLevel)
                        topLevelNodes.Add(portalSiteMapNode);
                    else
                        //Record the child against its parent's key; ComposeNodes reads these DictionaryEntry pairs
                        childParentRelationship.Add(new DictionaryEntry(parentSiteMapNode.Key, portalSiteMapNode));
                }
            }
            catch (Exception ex)
            {
                ExceptionManager.Publish(ex);
            }
        }

        /// <summary>
        /// This method will be called for all nodes and subnodes that can have children under them. For example, a NodeTypes.AuthoredLink
        /// type node cannot have child nodes.
        /// </summary>
        /// <param name="node">The node to find child nodes for</param>
        /// <returns>The SiteMapNodeCollection which contains the children of the child nodes</returns>
        public override SiteMapNodeCollection GetChildNodes(System.Web.SiteMapNode node)
        {
            return ComposeNodes(node);
        }

        /// <summary>
        /// Compose nodes when the method is called. At a minimum, this method gets called with the root node of every
        /// site collection. We must attach the top level nodes to the root node for this method to get called for those
        /// nodes as well.
        /// </summary>
        /// <param name="node"></param>
        /// <returns></returns>
        public virtual SiteMapNodeCollection ComposeNodes(System.Web.SiteMapNode node)
        {
            //The SiteMapNodeCollection which represents the children of this node
            SiteMapNodeCollection children = new SiteMapNodeCollection();

            try
            {
                //If an absolute root node, then add the top level children which are the same for every site collection
                if (node == node.RootNode)
                {
                    //Serve it from cache if possible.
                    //TODO: See if better way to do caching
                    object topNodes = HttpRuntime.Cache["TopNavRootNodes"];
                    if (topNodes != null && topNodes is SiteMapNodeCollection)
                        return ((SiteMapNodeCollection)topNodes);

                    lock (this)
                    {
                        //TODO: Check cache again. Threads may have been waiting at the lock.

                        //Two options available here.
                        //1. Reload from the list when cache expires in case that is needed
                        //(String.Compare returns 0 when the strings match)
                        if (String.Compare(ConfigurationManager.AppSettings["ReloadTopNavOnCacheExpiry"], "true", true) == 0)
                        {
                            rootNode = null;
                            LoadTopNavigationFromList();
                        }

                        //Else generate the top level nodes from memory. This must be done regardless of option 1 above
                        for (int i = 0; i < topLevelNodes.Count; i++)
                        {
                            children.Add(topLevelNodes[i] as PortalSiteMapNode);
                        }

                        //Add them to the cache
                        HttpRuntime.Cache["TopNavRootNodes"] = children;
                    }
                }
                else //Else this is a subnode, get only the children of that subnode
                {
                    string nodeKey = node.Key;

                    //Get the children for this nodeKey from cache if they exist there
                    object subNodes = HttpRuntime.Cache["TopNavRootNodes" + nodeKey];
                    if (subNodes != null && subNodes is SiteMapNodeCollection)
                        return ((SiteMapNodeCollection)subNodes);

                    lock (this)
                    {
                        //Two options available here.
                        //1. Reload from the list when cache expires in case that is needed
                        //Commenting out because the top node should decide if we are going to get the tree from cache, not subnodes
                        //if (String.Compare(ConfigurationManager.AppSettings["ReloadTopNavOnCacheExpiry"], "true", true) == 0)
                        //{
                        //    rootNode = null;
                        //    LoadTopNavigationFromList();
                        //}

                        //Else iterate through the nodes and find the children of this node
                        for (int i = 0; i < childParentRelationship.Count; i++)
                        {
                            string nKey = ((DictionaryEntry)childParentRelationship[i]).Key as string;

                            //If this is a child
                            if (nodeKey == nKey)
                            {
                                //Get the child from the arraylist
                                PortalSiteMapNode child = (PortalSiteMapNode)(((DictionaryEntry)childParentRelationship[i]).Value);

                                if (child != null)
                                {
                                    children.Add(child as PortalSiteMapNode);
                                }
                                else
                                {
                                    throw new Exception("ArrayLists not in sync.");
                                }
                            }
                        }

                        //Add the children to the cache
                        HttpRuntime.Cache["TopNavRootNodes" + nodeKey] = children;
                    }
                }
            }
            catch (Exception ex)
            {
                ExceptionManager.Publish(ex);
            }

            return children;
        }
    }
}
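And here are the promised web.config entries. This is a hedged sketch: the appSettings key names are the ones the provider reads, but every value, the assembly name, and the Version/PublicKeyToken are placeholders you must replace with your own:

```xml
<appSettings>
  <!-- Site collection whose root web holds the navigation list -->
  <add key="CompanyXXRootSite" value="http://portal.companyxx.com" />
  <!-- Title of the custom navigation list -->
  <add key="TopNavigationListName" value="Top Navigation" />
  <!-- Folder in that list where the hierarchy starts -->
  <add key="NavigationListStartFolderName" value="GlobalNav" />
  <!-- Column used to order items -->
  <add key="ItemOrder" value="ItemOrder" />
  <!-- Whether to rebuild from the list when the cache expires -->
  <add key="ReloadTopNavOnCacheExpiry" value="true" />
</appSettings>

<!-- Register the provider alongside the built-in SharePoint providers
     in the existing <siteMap><providers> section under <system.web> -->
<providers>
  <add name="CustomTopNavProvider"
       NavigationType="Global"
       type="CompanyXX.MOSS.Utilities.Navigation.Providers.CustomTopNavProvider, CompanyXX.MOSS.Utilities, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0000000000000000" />
</providers>
```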

I have used this TopNavProvider to build the navigation for a MOSS intranet with ~4000 users, as well as a MOSS internet-facing site with ~1.5 million visitors a month. Enjoy!!

I also created another custom navigation provider that reads the current navigation for every site from a similar list on that site and displays that somewhere else on that page (left or right navigation).

I recently ran into Access Denied errors when attempting to run stsadm on a dev Windows Server 2008 Web Edition server. I checked whether the user account I was using was a local administrator on the server, and it was.

I looked around some more and was not sure what was causing the problem. Then my friend suggested that we look into the User Account Control settings, and those were enabled to help "protect" the server. Turning those off allowed me to run stsadm from the command line.

The User Account Control (UAC) is found under Control Panel --> User Accounts --> Turn User Account Control on or off.

Tuesday, November 11, 2008

The SharePoint guidance, which focuses on WSS, went live last week. This guidance provides architects and developers best practices on how to:

-- Make architectural decisions about feature factoring, packaging, and the appropriate usage of design patterns.
-- Determine design tradeoffs for common decisions many developers encounter, such as when to use SharePoint lists or a database to store information.
-- Design for testability, create unit tests, and run continuous integration.
-- Set up different environments including the development, build, test, staging, and production environments.
-- Manage the application life cycle through development, test, deployment, and upgrading.

This is really useful and I have been using it on a recent SharePoint extranet project I am doing. Be sure to take a look. You can find more information about it on Blaine's blog post here. You can also check out the content on MSDN here. I will be presenting a session on this at the Rocky Mountain user group next week with John Daniels who was very involved in this project, so anyone in the area please plan on attending to get more details.

Wednesday, October 15, 2008

Sometimes you come across an issue when building a new Virtual Machine in VPC 2007 where you go to the CD tab to attach an ISO image for an OS, and the VPC console just sits there waiting and does not start the install.

The way to get around that is to get into the BIOS as the VPC is starting up by hitting the [Del] key, then go in and change the boot order to load from CD-ROM first.

This will start the OS install as soon as you attach an ISO image to the VPC.

Monday, October 6, 2008

Recently I did a CMS 2002 to MOSS migration. The project had enough challenges to be a good learning experience. I will be posting snippets of that knowledge here for others to use. This article is posted by my guest writer and friend Mike Dockery.

This article covers the need to create custom News page layouts in SharePoint 2007 and surface the news articles (or any other kind of information from a list) using the CQWP.

Create custom content types, attach them to new page layouts, and use those layouts to create news articles on a Publishing site that will be served up with unique styles in the Content Query web part. For reference, see http://blogs.msdn.com/markarend/archive/2006/07/25/678445.aspx

1. Custom Content Type

• From the site collection level Site Settings, click Site content types.
• Browse the gallery and check that no custom content types have already been created under Page Layout Content Types.
• Click Create for this new site content type.
• For Name, call it Vanguard QTC News Article.
• For Description, enter a variation of: This is a Content Type to create news articles for Quote to Cash that will be rolled up into Vanguard news aggregators.
• For its parent content type, select Article Page from the Page Layout Content Types group.
• Put this into the IHS Content group. If this is the first member of the group, then you will have to make a New group with this name.
• Click OK.
• Recreate the different content types as needed.

2. Custom Page Layouts

Using SharePoint Designer, open the site collection, then open the MasterPageGallery (in the _catalogs folder). Copy one or more layout files and rename the copies as in the following sample:

Associate with Custom Content Type

After copying the custom layouts, associate the custom content type with each one. From the top-level Site Settings, click the Master Page and Page Layout link under Galleries, and you should see the new custom layouts that were added. Edit the properties of each one to set the associated content type; select the custom type that you created.

• Content Type should be set to Page Layout.
• Name will be the filename as copied in the previous step.
• The Title property is the string that will appear on the page layout when a user edits the page to create or modify news. For example: IHS news article page with optional image at left.
• The Description property is the string that appears when people are choosing the format for their news article. For example: This is an approved layout for IHS news articles.
• The Associated Content Type property should be set to the custom content type that you created, which will allow the news aggregator (CQWP) to find this article quickly: IHS Content
• For Content Type Name, select the new site content type: IHS News Article
• Click OK.
• Check in and publish a major version.
• Approve the page.

3. Customize the Page Layout

At this point, you don’t have to do any further customization in order for the news aggregation scenario to work. But ...

Add a Description Control

The Content Query web part that is used as a news aggregator has several different viewing formats available. Several of these show the Description property of the news article page. Unfortunately, there is no control to enter the description on the page layout; to modify this out of the box you have to edit the properties from the page library view. But we can easily add a control to the page layout to provide an input field for the description property, and this makes it much easier for people writing news articles to provide an abstract for the article.

The new ASPX page should still be checked out. Open the file in SharePoint Designer. Edit each of the copied layout files (if more than one) and add two items in Code view: a NoteField control for the Description field, and an EditModePanel to show instructions for the Description field.

Locate the table containing the date and byline. Just above its closing tag, insert these controls into a new row in that table.
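As a hedged sketch of that new row (the two-column layout and the "Comments" internal name for the page Description field are assumptions; verify both against your own layout before using this):

```xml
<tr>
  <td colspan="2">
    <!-- Input field bound to the page Description (internal name assumed to be "Comments") -->
    <SharePointWebControls:NoteField FieldName="Comments" runat="server" id="DescriptionField" />
    <!-- Instructions visible only while the page is being edited -->
    <PublishingWebControls:EditModePanel runat="server" id="DescriptionInstructions">
      Enter a short abstract (255 characters max); the Content Query web part
      displays it under the article title.
    </PublishingWebControls:EditModePanel>
  </td>
</tr>
```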

Save the custom article page. Close SPD. Back in the browser interface, check the file in.

At this point, you are ready to create new pages with this layout.

Create a new page with this layout

This step is an example; it is not necessary to create the masters. Go to a site in the hierarchy of the portal that has the Publishing feature activated. By default, the Corporate Intranet Site and the Team Publishing Site have this feature, but it is easy to activate for other sites as well: go through Site Settings > Site features > Office SharePoint Server Publishing, then click Activate.

•Under Site Actions, click Create Page.

•Enter the Title of the article.

•Leave Description blank for now.

•For URL Name, the system will automatically populate it with the article title. Change it and shorten it depending on the actual title. Remove special characters.

•Select the new custom layout: IHS News Article

•For Page Layout, you must choose one of the following Content Types depending on the article and where you want it displayed in the roll up web parts:

•IHS Colleague Announcements

•IHS Engineering News

•IHS Leadership Team Messages

•IHS News Article

•Click Create.

The new article will be in edit mode.

•The Description section is important because this is what will be the short abstract shown within the Content Query Web Part. Insert meaningful text that summarizes the article. This must not exceed 255 characters.

•Enter the Article Date and Byline.

•In the Page Content section -- the primary content -- click Edit Content or the "Click here to add new content" link. Insert the text or else use the rich text editor to format the article.

•Rollup Image will output in the Content Query Web Part as a little 50 pixel square image next to the description/abstract. Ensure the image is small and legible.

•Click Publish to make the article available. If you need to edit the existing article, open the article, click Edit Page. Click Publish when finished.

4. CQWP Style Output Examples

IHSwwitTwoColumn

Title linked in blue and 'read more' in orange.

IHScomSingleCreated

Full date, Title and 'more' linked in blue

IHSNewsTwoColumn

Date, Title linked in black, arrows linked in blue, and right line between 2 articles.

NewsTwoColumnOrange

Full date in small text, Title linked in blue, 'more' linked in orange, and right line between 2 articles.

5. Customizing the ItemStyle.xsl for CQWP

From the top site of the site collection, browse All Site Content or Content and Structure. Open the Style Library and get into the XSL Style Sheets folder.

Open the context menu for ItemStyle, point to Send To, and choose Download a Copy; save it to your desktop. Please make a backup of the file right away! Versioning is on by default in the Style Library, but it’s sometimes easier to quickly upload the original file if things go awry in the XSL editing (don’t worry, you’ll error out at some point!).

A great reference is found in “Customizing the Content Query Web Part XSL” (http://www.microsoft.com/belux/msdn/nl/community/columns/stevenvandecraen/contentquerywebpart.mspx) on MSDN.

1. Open ItemStyle.xsl in any text editor or SPD.

2. At the top, alongside the other namespace attributes, add the ddwrt namespace, which provides the date-formatting functions.
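The declaration goes on the xsl:stylesheet element; the ddwrt prefix and URI below are the standard ones SharePoint uses for its DataView runtime functions (keep the file's other existing xmlns attributes as they are):

```xml
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:ddwrt="http://schemas.microsoft.com/WebParts/v2/DataView/runtime">
```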

3. Copy an existing template in the file to use as the starting point for the custom style.

4. Begin to edit the copied template by first changing the name and match attributes to be unique. The name will be selectable in the Item Style section when editing the CQWP.

5. Add a Created variable that will display the date in the web part for each article. This uses the internal column name. How do you find out the internal column names? See Heather Solomon’s blog (http://www.heathersolomon.com/blog/articles/CustomItemStyle.aspx) and this MSDN Forum (http://forums.msdn.microsoft.com/en-US/sharepointcustomization/thread/6328a12c-6c15-4c98-a997-e5e7104706c3/) for further details.

•MM/dd/yyyy is an example of the output of the date format. Some options:

MMM dd, yyyy -- Oct 11, 2008

MMMM dd, yyyy -- October 11, 2008

MM/dd/yyyy -- 10/11/2008.

dd/MM/yyyy -- 11/10/2008
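Putting steps 4 and 5 together, here is a sketch of the template opening and the Created variable. The template name shown is one of the style names from section 4 below; the @Created column reference is an assumption (substitute the internal column name you looked up), and ddwrt:FormatDateTime takes the date string, a locale ID (1033 for US English), and the format string:

```xml
<xsl:template name="NewsTwoColumnOrange" match="Row[@Style='NewsTwoColumnOrange']" mode="itemstyle">
  <!-- Format the item's Created date for display in the web part -->
  <xsl:variable name="Created">
    <xsl:value-of select="ddwrt:FormatDateTime(string(@Created), 1033, 'MMM dd, yyyy')" />
  </xsl:variable>
  <!-- rest of the copied template -->
</xsl:template>
```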

6. In our example, we will output the Date above the Title. The title will be linked to the published article page. There will be a brief article synopsis that pulls from the article's Description. Finally, a "more" link also links to the full article.

The actual output will be contained within the "link-item" Div near the bottom of your copied template. Make your edits within this Div.

7. Wrap the Created variable in its own Div right after the beginning Div for "link-item". This will be the article date. (Leave the CallPresenceStatusIconTemplate in place.)

Adjust the font styles, colors, alignment and other CSS attributes using HTML tags. Be careful in this section and test after small changes to be sure you don’t crash the page.

8. For the DisplayTitle variable, surround it with the link tag and add in the style attribute to color it any way you choose.

9. The Description will display on a new line, followed by the "more" link and, lastly, the end Div tag for the "link-item" class.
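A hedged sketch of the output described in steps 7 through 9 (it assumes the $SafeLinkUrl and $DisplayTitle variables already present in the copied template, plus the $Created variable from step 5; the inline colors are just examples):

```xml
<div class="item link-item">
  <!-- Step 7: the article date in its own div -->
  <div style="font-size:8pt;color:#666666;">
    <xsl:value-of select="$Created" />
  </div>
  <!-- Step 8: the title, linked to the published article page -->
  <a href="{$SafeLinkUrl}" title="{@LinkToolTip}" style="color:#003399;">
    <xsl:value-of select="$DisplayTitle" />
  </a>
  <!-- Step 9: the synopsis from Description, then the 'more' link -->
  <div class="description">
    <xsl:value-of select="@Description" />
    <a href="{$SafeLinkUrl}" style="color:#FF6600;">... more</a>
  </div>
</div>
```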

10. To output the "more" link in different languages, simply get the translation for the word(s) you need, be it "read more", "full article" or whatever. If the translation has special characters, you will need to use the ASCII representation. The easiest method is to paste the translated text into Microsoft Word, save the page as an HTML page, then open that page in a text editor to get the ASCII code. For "more", the translations are listed below as they should appear in the link text in the XSL file:

German: ... mehr

French: ... plus

Italian: ... più

Spanish: ... más

Russian: ... подробнее

Dutch: ... meer

11. Save the file and upload it back to the Style Library, add in comments, and check in the file.

12. Open up the site where you placed the imported Content Query Web Part.

13. From the Site Actions menu, click Edit Page.

14. Modify the CQWP and expand Presentation.

15. In the Styles section, leave the Group style as Default and, for Item style, select your new template name from the XSL file.

I just wanted to reinforce, with something I recently worked on, how easy it is to rebrand SharePoint site collections or even just individual sites.

This is a picture of the new IT site of the intranet, part of a communication initiative I was leading in April/May at my last job. The styles were done through an override CSS, but another thing to notice is that paying attention to the actual page content helps as well.

Update: Anybody interested in branding, please make sure to check out my original post where I explain the details around this branding.

Sunday, September 28, 2008

A few months ago, we were given the task of determining the success our users were having when trying to find documents and web pages through our intranet implementation of Microsoft Office SharePoint Server (MOSS) 2007. We first looked at built-in solutions offered by MOSS, but all of their analytics followed the ‘what-you-see-is-what-you-get’ paradigm, and none gave a straightforward answer to the questions we needed answered. This led us to pursue a custom analytics solution, and after much data probing and testing we ended up leveraging SQL Server Analysis Services (SSAS), ProClarity and ProClarity Dashboard to achieve our goal.

METHOD:

Step 1: Finding the Data

MOSS stores data all over the place in a variety of different formats, so sorting out where the information we want to use lives can take quite a bit of time. Luckily, our solution requires no file parsing and no 3rd-party applications; all of the search data is stored in two separate MOSS databases. These should be your shared service provider (SSP) content database and your MOSS content database. Our setup used the following two databases, as pictured below:

The database names might differ, but you’re looking for the tables highlighted above. Once you’ve found those, you’re set.

Step 2: Establishing the Data Source View

How MOSS actually uses these tables in any manner other than how I did is a mystery to me (although some of you more fluent in database design might understand them), but after a couple of days of table analysis, I determined how the tables are linked with one another well enough to retrieve the information I desired. All of the search information is inside the tables marked with the prefix ‘MSSQLog’, so if you wish to examine the data on your own, that’s where you’ll find it. Creating an OLAP cube for use in ProClarity is a powerful analysis technique, so I structured the data into a typical data warehouse fact-dimension relationship, pictured below:

The table joins above are pretty straightforward (id to id) except for the join between the contextual scope table and the MSSQLog fact table. This join is more complicated because we’re using the Webs table, and by now you might be wondering why we’re even using the Webs table in the first place.

The Webs table provides a listing of every site in your MOSS site hierarchy, giving for each site: a reference to its parent site, the site name, and the URL where it is located on your server. This is extremely useful because without it all of your search statistics would be consolidated into one all-encompassing ‘main site’, leaving out individual site statistics. If you are looking for that overall consolidation, OLAP cubes allow you to aggregate all the sites together anyway, so really we’re just gaining extra functionality by adding this table in.

The join to the fact table isn’t exactly straightforward because the Webs table actually resides in a different database (as you should know from step 1). Therefore, it’s not going to have any key columns or really anything in common with the fact table except for one column: URL. The URL listed in the fact table is the URL where the search was performed (not what link the person clicked on, although that information is stored in another table), and since the URL listed in the Webs table is the URL of the site, the two should match up. A little bit of string parsing needed to be done on the URL field in the fact table, as it had the “http://” prefix (e.g. http://MOSSsite.com/engineering/softwareDev/) whereas the URL field in the Webs table was just the folder hierarchy (e.g. engineering/softwareDev/).
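A hedged sketch of that join as a named query for the data source view. The database, fact table, and URL column names below are placeholders (substitute your own MSSQLog fact table and its URL column); Webs.FullUrl is the server-relative site path:

```sql
-- Strip 'http://host/' from the fact table's URL so it matches Webs.FullUrl
-- (e.g. 'http://MOSSsite.com/engineering/softwareDev/' -> 'engineering/softwareDev/')
SELECT f.*, w.Id AS WebId, w.Title AS SiteTitle
FROM SSPDB.dbo.MSSQLogFact AS f          -- placeholder fact table name
JOIN ContentDB.dbo.Webs AS w
  ON w.FullUrl = SUBSTRING(f.Url,
                           CHARINDEX('/', f.Url, 8) + 1,  -- first '/' after the host
                           LEN(f.Url))
```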

Step 3: Building the Cube

With the data source view created, the next step is to create the OLAP cube to browse data quickly. It’s a fairly basic cube. The only thing I really changed from the default options was that I added in a pre-built time dimension. I thought it would make sense to see data even for days when there weren’t any searches (otherwise, if there weren’t any searches over a weekend, for example, the graphical representation of the data would just skip over it and not show zero searches for those days). Also, in the Site dimension, I had to add a parent-child recursive hierarchy built on the ‘Parent Web Id’ attribute (as pictured below). You’ll need to add that in to build your site hierarchy; otherwise the dimension will just be a flat list of every site in your MOSS implementation. Other than those two minor changes, here are the measures, attributes, and dimensions I ended up with in my cube (your implementation may vary).

To give a few examples of what we can do with all this data:

The number of queries that have occurred

What was queried

What site the search happened at

How long it took before the user clicked on a result

Whether it was a best bet that was clicked

The average time to click for a given site

The number of results the search returned

The time frame on a given search.

Now all of this data is available in the SSAS cube browser, but unless your end user is familiar with how pivot tables and OLAP cubes work, you’ll probably want to wrap it in a more visually friendly interface, which brings us to ProClarity.

Step 4: ProClarity Integration

The next step is to publish your cube to your SSAS server and find it in ProClarity Professional. Now that we have all this search information, how do we answer that initial question of search success? The actual term ‘search success’ is up to the interpretation of whoever is designing the tool, but I decided that in order to show search success you’d need more than one graph. In fact, I ended up designing about 18 graphs in ProClarity, although not all were used:

You can see that you can get some pretty useful information, such as top 10 queries without best bets. Perhaps even more useful might be the top 10 queries with worst results that don’t have best bets.

After you’re done playing around with your graphs, you could stop there and just point your users to the ProClarity graphs (presumably using ProClarity Standard), but for this solution we wanted to surface the graphs through MOSS. We also wanted multiple graphs to show up on a single page, as viewing 18 graphs individually might become tedious, so ProClarity Dashboard became the platform to view all this data.

Step 5: ProClarity Dashboard and MOSS Integration

Creating a dashboard is pretty straightforward, and it’s very much up to you how you want to organize your information. For our solution, we organized the dashboard into four separate tabs: Site Search Analysis (for each individual site), Search Analysis (general statistics), Best Bet Analysis (for improving search efficiency), and Summary (a general consolidation of the other three tabs). You can see the end result looks very visually appealing:

Dashboard makes the search analytics data easy to understand and very easy to use

The last step is to integrate this into MOSS. Because the dashboard is just a simple web page, we can use a Page Viewer web part and just refer to the dashboard URL. The end result will look similar to this:

Hopefully this has served as a good, general outline of how you can create your own custom MOSS search analytics solution using SSAS, ProClarity and ProClarity Dashboard!

The MOSS intranet that I helped create last year, and that is documented in this blog post, won awards recently. Woo-hoo!

TheSource was honored by the Colorado Chapter of the International Association of Business Communicators (IABC) on June 12 at the annual Bronze Quill Awards banquet in Denver.

IABC is a professional association that provides ongoing learning, resources and research to a professional network of more than 15,000 business communication professionals in over 60 countries.

Based on criteria that evaluated the program objectives and the results of implementing the strategy, TheSource was awarded an Award of Excellence in the Company Intranet category. The entries were judged by communications professionals in Seattle and Minneapolis.

The big win of the night came when TheSource was also awarded the coveted Best in Show award. The Best in Show is chosen from the top entries in each of the categories.

The judges praised TheSource for being a “robust, content-rich communications tool,” and they appreciated the team approach that was used to identify need and strategy and decide on objectives.

Congratulations to everyone on the team who helped create the award-winning TheSource!

Monday, May 12, 2008

Challenge

Here is the scenario. MOSS is a great platform to work on, but some things about it are not ideal for a customer-facing .com site. I will compare and contrast two features in CMS with their counterparts in MOSS.

1) The first example is the term 'Pages' in the URL. Since pages in a publishing site are stored in the 'Pages' library by default, that term appears as part of the external URL. To some eBusiness users that is not acceptable, because the term means nothing relevant to a search spider and it probably hurts the URL's ranking.

2) CMS 2002 allowed subdomain mapping to top-level channels. That means you could have as many subdomains as you wanted on one IIS website, which is a best practice for both CMS and MOSS from a performance standpoint (fewer websites on the server = GOOD). In MOSS 2007, however, there is no such direct mapping. So the options are a) have separate MOSS web applications for all the subdomains (OOPS) or b) do some fancy URL rewriting. Needless to say, we went with URL rewriting, because we had over 25 subdomains that could grow with time, and we did not want to run that many MOSS web applications given the performance implications.

So as part of the CMS to MOSS migration, we brought over all the subdomains (~20) into one site collection. The reason for this is that the subdomains are very closely tied together and the amount of data was not too large (15 GB). From a maintenance perspective, we can separate these subdomains into different site collections in the future should we decide to. We are not using the variations feature in MOSS because we don't have exact content mirrors in all of our subdomains. We are also using content deployment to push content from the authoring farm to the production farm.

So we had two significant challenges to overcome. One was to map many subdomains to second-level sites in a site collection. The other was to allow for .htm extensions and get rid of 'Pages' in the URL, for the reasons explained below.

Solution

At this stage we could have gone with an ISA Server/firewall mechanism to meet our URL rewriting needs. The problem was that we didn't have enough time to test and implement an ISA Server solution, not to mention the cost of the ISA Server itself. The other option was a URL rewriting mechanism for IIS, along the lines of Apache mod_rewrite, to translate our URLs on the fly. We went with one such third-party IIS rewrite solution.

The IIS rewrite rules were set up on an empty honeypot IIS website that contains host header entries for all the subdomains we serve; the IIS rewriter acts on those requests and performs a reverse proxy to the real MOSS Web application to serve the content. There are additional rules to map the .js, .css, and other files that are loaded on every request. For example, a request to www.company.com/xyz.htm gets translated internally to ext.company.com/www.company_com/xyz.htm, and the content is served back without the link changing in the browser's address bar, which is a function of the reverse proxy.
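The post does not name the rewrite product or its rule syntax. In Apache mod_rewrite notation (which the IIS tool mimics), the reverse-proxy rule just described might look roughly like this; the hostnames follow the example above:

```apache
# Illustrative sketch only, not the actual rules used.
# [P] proxies the request instead of redirecting, so the browser's
# address bar keeps the original www.company.com URL.
RewriteCond %{HTTP_HOST} ^www\.company\.com$ [NC]
RewriteRule ^/(.*)$ http://ext.company.com/www.company_com/$1 [P,L]
```

Similar rules per host header cover the other subdomains and the shared .js/.css resources.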

The other issue was that all our pages were surfaced as .htm in CMS, and moving them to .aspx in MOSS would break hundreds of thousands of links, on top of including 'Pages' as part of the URL. The eBusiness team also sincerely believed that having 'Pages' in the URL does nothing to help our SEO and probably hinders it. So this was deemed a showstopper and we had to devise a solution. The solution was to use the friendly URL feature offered by 'Rapid For SharePoint', which is basically an HTTP module. This module takes out pages/pagename.aspx and replaces it with pagename.htm (the extensions are configurable). So a request for auto.company.com/support/pages/default.aspx is instead served as auto.company.com/support/default.htm, which is acceptable to the eBusiness team and does not break the links. The module also changes links within the page content to point to filename.htm instead of pages/filename.aspx. We also used their XHTML filter, which matches a regular expression pattern and replaces it with other text, to change relative links to absolute ones in the HTML fields.
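To make the translation concrete, here is a small Python sketch of the two URL mappings the module performs. The function names are mine for illustration; the real product is an HTTP module running inside IIS, not Python:

```python
import re

def to_friendly(url, ext=".htm"):
    """Collapse /pages/name.aspx into /name.htm for outbound links."""
    return re.sub(r"/pages/([^/]+)\.aspx$", r"/\1" + ext, url,
                  flags=re.IGNORECASE)

def to_internal(url):
    """Expand /name.htm back into /pages/name.aspx before MOSS sees it."""
    return re.sub(r"/([^/]+)\.htm$", r"/pages/\1.aspx", url,
                  flags=re.IGNORECASE)
```

For example, to_friendly("http://auto.company.com/support/pages/default.aspx") yields "http://auto.company.com/support/default.htm", matching the behavior described above, and to_internal reverses it for incoming requests.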

In MOSS, all requests to a site get translated to /pages/default.aspx, as that is the default page in the Pages library. For example, a request for auto.company.com/support gets translated to auto.company.com/support/pages/default.aspx. MOSS returns an HTTP 302 redirect with the new link to the browser, which bypasses the reverse proxy mechanism and changes the URL in the browser's address bar. To avoid this, I put a rule in our IIS rewriter that translates all requests to sites and subsites by appending the default page name to the request before it ever reaches the MOSS Web application. This obviates the 302 redirect problem. So a request to http://www.company.com/ is translated to www.company.com/default.htm and is then proxied to the MOSS website.
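The behavior of that rule can be sketched as follows (a Python illustration of the logic, not the actual rule syntax; the function name is hypothetical):

```python
def add_default_page(path, default_page="default.htm"):
    """If a request targets a site or subsite (no file extension),
    append the default page so MOSS never needs to answer with a 302."""
    last_segment = path.rstrip("/").rsplit("/", 1)[-1]
    if "." in last_segment and not path.endswith("/"):
        return path  # already a concrete page request; leave it alone
    return path.rstrip("/") + "/" + default_page
```

A request for /support becomes /support/default.htm, while /support/contact.htm passes through untouched.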

There was also a requirement to serve MOSS page links containing .jsp extensions, since CMS didn't really care about extensions (you could request a page with .htm, .jsp or no extension and it would be served), so there were some .jsp links that had to be honored. I achieved this with another IIS rewrite rule.

Considerations

1. Site page redirects in MOSS all respond with an HTTP 302, which will throw the reverse proxy mechanism off. Hence all redirect pages need to specify the final, external (SEO-friendly) link.

2. All sites need an index page called index.htm, because all requests to sites will be translated to site/index.htm to avoid MOSS sending back an HTTP 302 (the name index is arbitrary; you could use default.htm or any other name). The index page can then be a redirect page to the destination page (with a fully qualified external redirect link) if need be.

3. The XHTML filter is needed to change all relative links to absolute ones (for example, changing "/auto.company.com to "http://auto.company.com/). This was added for all subdomains.
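One way such a regex-based filter can turn root-relative links into absolute ones is sketched below in Python. This is a simplified illustration; the actual XHTML filter takes a configurable pattern/replacement pair, and the function name is mine:

```python
import re

def absolutize_links(html, host):
    """Rewrite root-relative href/src attribute values into absolute URLs."""
    return re.sub(r'\b(href|src)="/',
                  r'\1="http://' + host + '/',
                  html,
                  flags=re.IGNORECASE)
```

So a fragment like &lt;a href="/support/default.htm"&gt; comes out pointing at http://auto.company.com/support/default.htm once the filter runs for that subdomain.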

Long Term Strategy

The long-term strategy should be to use ISA Server to do the address translation, to reduce the load on the MOSS Web Front End servers.

Saturday, March 22, 2008

Recently, during our CMS to MOSS upgrade, we had issues with timer jobs not running on time because of DST problems (we did not have the DST hotfixes or SP1 applied yet). Some of the problems this caused:

Content deployment jobs (even on the same server) were timing out - basically they would wait for an hour to run and then time out anyway.

If you go ahead and create new Web applications, it takes about an hour for those Web apps to actually be provisioned.

Other timer jobs are affected too.

There is a way to fool a timer job by setting the time on your server back one hour so it thinks it is time to run. However, this worked in some cases and still not in others (such as content deployment). We found a way to actually force these timer jobs to run from the command line with stsadm -o execadmsvcjobs, which immediately executes any pending administrative timer jobs on the server.

At this point we could have installed the DST patch or gone down the SP1 route. We did not go the (WSS and MOSS) SP1 route because it hosed our test environment. WSS SP1 also threw an error and did not install successfully in our staging environment, though MOSS SP1 installed fine there. This constituted a significant risk in our minds, so we decided not to move forward with the SP1 install on production. (Needless to say, we had to rebuild our stage and test farms, because you cannot just roll back from the SP1 upgrade.)

So we decided to move forward with just the patch (for now) to fix the timer jobs problem. This worked fine in our test and staging environments, which had minimal data because we had to rebuild them and had not had time to reattach all the content databases from production yet. On production, however, the patch threw an error and about 5% of the content went missing, though the Web apps loaded fine. Interestingly enough, all the content migrated from SharePoint 2003 to MOSS 2007 last year appeared to be intact; it was the new sites we had created on MOSS that were missing.

So we decided to attach the backed-up databases from the night before, thinking that MOSS probably stored all the configuration for the patch in the database, and that going back one night would bring us back to before we applied the patch. Unfortunately, upon attaching those databases the apps did not work, and we got this error in the event log:

The schema version (3.0.149.0) of the database SharePoint_AdminContent_711c9d8b-17ed-404c-987a-708e0e059b12 on DBSERVERNAME is not consistent with the expected database schema version (3.0.151.0) on WEBSERVERNAME. Connections to this database from this server have been blocked to avoid data loss. Upgrade the web front end or the content database to ensure that these versions match.

At this point we had two options, since the server appeared to be hosed: rebuild the production server, or restore the image backup of the entire server (bare metal restore) as defined in the SLA with our backup provider. Neither alternative was rosy, so we decided to tinker a little. We looked around and found a blog post that talked about this problem. Thanks to Adlai Maschiach's blog for helping us get past this glitch.
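The fix itself is not reproduced here; the usual remedy for this schema-version mismatch after installing a patch is to finish the upgrade on each server with psconfig, along these lines (assumed, not quoted from the referenced blog):

```shell
REM Run on each SharePoint 2007 server after the patch binaries are installed.
REM b2b (build-to-build) completes the database schema upgrade so the
REM content databases match the Web front ends again.
psconfig -cmd upgrade -inplace b2b -wait
```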

Now we will go back to the drawing board and find a way to upgrade the production farm again. I will post those experiences here shortly.

Saturday, February 9, 2008

We recently had a need to find out how documents were being accessed on our SharePoint 2007 intranet. This need tied back to the original intent of minimizing user clicks to important information. Hence we had to figure out the hit counts of all documents on our intranet, and then surface the most-accessed documents corporate-wide on a page that organizes this information based on our classification, providing quick access to employees.

SharePoint OOB does not provide this level of reporting across all your site collections. My friend Travis wrote up a nifty little SQL script that does the trick. Enjoy!!

DECLARE @table TABLE (DocName varchar(4000), HitCount bigint)

DECLARE @doc varchar(4000)
DECLARE @hits bigint

DECLARE c_docs CURSOR FOR
    SELECT DISTINCT DocName
    FROM [MOSS_SSP_DB_Name_Here].[dbo].[ANLSiteResourceHits]
    WHERE ([DocName] NOT LIKE '%.aspx') AND ([DocName] NOT LIKE '%.html')
      AND ([DocName] NOT LIKE '%.asp') AND ([DocName] NOT LIKE '%.htm')
      AND ([DocName] NOT LIKE '%.xml') AND ([DocName] NOT LIKE '%.xsd')
      AND ([DocName] NOT LIKE '%.one') AND ([DocName] NOT LIKE '%.xsn')

-- Reconstructed cursor loop: assumes one row per hit in ANLSiteResourceHits;
-- adjust the aggregation if your schema stores a per-row hit count instead.
OPEN c_docs
FETCH NEXT FROM c_docs INTO @doc
WHILE @@FETCH_STATUS = 0
BEGIN
    SELECT @hits = COUNT(*)
    FROM [MOSS_SSP_DB_Name_Here].[dbo].[ANLSiteResourceHits]
    WHERE DocName = @doc

    INSERT INTO @table (DocName, HitCount) VALUES (@doc, @hits)
    FETCH NEXT FROM c_docs INTO @doc
END
CLOSE c_docs
DEALLOCATE c_docs

SELECT DocName, HitCount FROM @table ORDER BY HitCount DESC

Tuesday, January 8, 2008

ASP.NET 2.0 and MOSS 2007 provide great navigation controls, but the HTML emitted when those controls render is not the cleanest (for example, view the source of a MOSS page containing a menu and a treeview and you will see what I am talking about). The Menu and TreeView controls emit a bunch of table tags, which is not a best practice. It is desirable to control the output of these controls to make your pages more CSS-based, and it would be a shame to have to write your own menu control just to control its HTML output.

ASP.NET's control adapters come to the rescue. When configured correctly, they can convert the table tags emitted by these controls into ul and li tags. I recently tried this out to make our menu control more CSS-based, and it worked wonderfully. The caveat is that this will completely change your CSS styles for the menu. Try it out; this works great, and making your pages more CSS-based is definitely a best practice.
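For reference, control adapter mappings live in a .browser file under the App_Browsers folder. A minimal sketch, assuming the CSS Friendly Control Adapters project (the adapter type names here follow that project and may differ by version):

```xml
<browsers>
  <browser refID="Default">
    <controlAdapters>
      <!-- Map the ASP.NET Menu and TreeView controls to adapters that
           emit ul/li markup instead of nested tables -->
      <adapter controlType="System.Web.UI.WebControls.Menu"
               adapterType="CSSFriendly.MenuAdapter" />
      <adapter controlType="System.Web.UI.WebControls.TreeView"
               adapterType="CSSFriendly.TreeViewAdapter" />
    </controlAdapters>
  </browser>
</browsers>
```

Since SharePoint's AspMenu derives from the standard Menu control, the Menu adapter mapping should cover it as well.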

So a few days ago I started playing with and testing the MOSS content deployment feature. From my previous readings it seemed easy enough. My approach was to start from scratch so I could understand what works and what doesn't. I hit quite a few errors along the way and wanted to publish them here so you can learn from my mistakes. As I mentioned in my previous post, we will use a three-stage topology for content deployment. I still have some other questions about this whole process, but I will update this post as I learn more.

The first scenario I tried was moving from one site collection to another in the same Web application. That did not work out so well. I got errors along the lines of "Unable to import the folder _catalogs/masterpage/Forms/Page Layout. There is already an object with the Id 853c8232-ae6d-4626-9cae-682xxxxxx in the database from another site collection." The other error I got was "Unable to import the folder WorkflowTasks/Office SharePoint Server Workflow Task. There is already an object with the Id 3d27c6ef-cc9c-4de5-b671-xxxxxxxxxxxx in the database from another site collection.". Supposedly this error occurs when the site collections share the database and an object already exists in the database due to the first site collection that the content is being exported from. The way I got rid of this error was to create a new Web application and move content between site collections that are in different Web applications.

Moving between site collections in different Web applications did not work initially either, because the destination site collection was based on a template other than the blank template. It is imperative that the destination site collection be based on the blank template for content deployment to work. The error I got was "Content deployment job 'Remote import job for job with sourceID = e94ecf30-33d2-498d-ae5c-xxxxxxxxxxxx' failed. The exception thrown was 'Microsoft.SharePoint.SPException' : 'Cannot import site. The exported site is based on the template XYZ but the destination site is based on the template ABC. You can import sites only into sites that are based on same template as the exported site.'". By the way, the error was confusing because it did not mention that the destination site needs to be a blank site.

The other task I had to complete was to deactivate all the features on the target site collection; for some reason they were causing a problem. More on this later, but keep in mind that you will probably have to deactivate the features on your destination site collection.

After these corrections, the content deployment between site collections in different Web applications worked. The next test was to move the content between farms in different domains :).

Initially this did not work and failed with the following error: "Content deployment job '[jobname]' failed. The remote upload Web request failed." The DNS was set up correctly (using the HOSTS file) and I could browse to the destination site from a browser on the source server. I looked in the event log for an explanation and found the following: "Failed to communicate with destination server for Content Deployment job '[jobname]'. Exception was: 'System.Net.WebException: Unable to connect to the remote server ---> System.Net.Sockets.SocketException: No connection could be made because the target machine actively refused it". This didn't make a lot of sense right away, but then I realized the problem. In the target farm we had one app server, 2 WFEs and 1 DB server. The account being used to authenticate against the central administration of the target server in "Content Deployment Settings" on the source did not have permissions in the destination site collection. I gave that account permissions to the site collection and voila, it worked.