Wednesday, November 24, 2010

If you are reading this post, please read through my previous post on this topic to get some background on tricks to solving the "Could not find Feature" issue with built-in SharePoint features.

If you are still reading, then you have probably built and deployed custom features to your farm, such as a web part. You are then running into problems during Content Deployment jobs between farms that have different SharePoint versions on them, such as deploying from Enterprise to Standard.

The solution that worked for the built-in features was appropriate because the features that were causing the problems didn't need to be installed on your destination farm. But with custom features, that is not the case. You built them precisely because you need to be able to run them on the destination farm.

I encountered this same problem and it took me a little while to figure out why it was unhappy. Here are the steps I'd suggest you follow:

Check the 14-hive on the destination Web Front End to verify your feature is there.

Verify that your feature is installed in the destination site. For instance, check the Site Features page for features scoped to the Site level.

Try enabling the feature and making sure it doesn't throw any errors.

Maybe you've done all those things and the feature looks good, but it's still not working? At this point, you should be confused. I know I was. I decided to check the settings on my package in Visual Studio. The Deployment Server Type was set to WebFrontEnd, which was correct. I then checked the 14-hive on the destination App Server, which is where Content Deployment jobs are run. No feature folder there.

Now I know what you're thinking - isn't that what we want? Solutions that are targeted as WebFrontEnd are marked that way for a reason - they don't need to be on the App Server because it's not serving up those web parts. Well, when it comes to Content Deployment, apparently they DO need to be there. I assume this is because the Content Deployment job is going through Central Administration, which is living on your App Server.

The solution I came up with was to enable the Microsoft SharePoint Foundation Web Application service on my App Server, effectively making it a Web Front End as well. This ensures that when I add solutions to my farm that the features get deployed to my App Server as well, effectively squashing the Content Deployment job errors I was receiving.

My App Server is already segmented off from the Web Front Ends by a firewall to keep it from being accessible from the internet. However, to ensure that my App Server is never used as a Web Front End, I am also making sure it is never listed in the load balancer and I am blocking traffic to port 80 on that box. This means there should be no real impact to the machine and my Content Deployment jobs can now run successfully.

Lately I've been working a lot with Publishing sites, which means I've been using Content Deployment jobs to move content between my farms. Unfortunately, I've learned the hard way that this part of SharePoint is rather particular about farm setups.

I have an Enterprise development farm that I use for consulting work. I built some pages on it and then wanted to deploy them to my customer's test farm, which runs a Standard license. This broke on me every single time. Specifically, I kept getting errors that features didn't exist on my destination farm.

One message I got was "Could not find feature IPFSSiteFeatures".

These are all the features that existed on my Enterprise development farm but not on my Standard test farm:

IPFSSiteFeatures

WACustomReports

PPSWorkspaceCtype

PPSMonDatasourceCtype

PPSWebParts

PPSSiteCollectionMaster

To get my content deployment working again, I enabled them temporarily on the destination farm, performed the deployment, and then disabled them again. You might also be able to disable/uninstall those features on your Enterprise farm before doing the content deployment job and achieve success, but I didn't test that option. Note: Make sure you disable these features on your destination farm after deployment, or you may be in violation of your SharePoint license.
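The temporary enable/disable round-trip can be scripted. Here's a sketch in PowerShell, using the feature names from my report; the destination URL is a placeholder and your feature list may differ:

```powershell
# Feature folder names taken from the deployment report above.
$features = "IPFSSiteFeatures", "WACustomReports", "PPSWorkspaceCtype",
            "PPSMonDatasourceCtype", "PPSWebParts", "PPSSiteCollectionMaster"
$site = "http://destination/sites/target"   # placeholder destination site collection

# Enable each feature before running the Content Deployment job...
$features | ForEach-Object { Enable-SPFeature -Identity $_ -Url $site }

# ...run the deployment job...

# ...then disable them again to stay within your license.
$features | ForEach-Object { Disable-SPFeature -Identity $_ -Url $site -Confirm:$false }
```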

If you made the mistake of installing Excel Services and PerformancePoint Services on your Enterprise environment, your problems are more difficult. If you run Get-SPFeature, you will see more items in the list that will cause your content deployment to fail, such as BizAppsListTemplates.

You might be tempted to disable them on your source farm. Unfortunately, when I tried, I learned you cannot disable them with PowerShell because they are still in use. However, those features aren't activated on any of my sites. Because of my failure to disable them on the source farm, I did not test enabling them on the destination farm because I wasn't sure I'd be able to get rid of them and I didn't want to have to rebuild the entire farm again and lose my content.

I believe these features might be activated inside the service application itself, but I have not found a way to confirm this theory. I tried deleting the Service Application, but the features still couldn't be removed. If you figure this out, let me know.

Summary

In short, here are the things to keep in mind if you want to do a Content Deployment job between two farms:

Source and Destination farms should be on the same version of SharePoint.

You must deploy solutions that exist on your source farm to your destination farm prior to running the job.

Features that are enabled on your source farm will automatically get enabled on your destination farm, but if you encounter an error with a feature, verify that the feature is installed on the destination.

There are some additional quirks if you receive the "Could not find feature X" for custom features that you have developed. I'll cover those in a followup post.

Wednesday, October 6, 2010

SharePoint 2010 allows inline editing of your HTML on Publishing pages, which is a nice boost to productivity. However, there are some quirks with this feature that can really trip you up if you aren't ready for them.

Here's a scenario I ran into this week. I have a simple Publishing Page with a Content Editor Web Part on it. Inside this web part, there is an HTML element that I target with some jQuery to display tooltips. The JavaScript here is really quite simple and just dynamically adds a relatively positioned div to the page above the target element.

The code for this is tested and works fine outside of SharePoint. But when you edit a page, you'll find an interesting side effect. Specifically, when editing the page, the JavaScript that adds the dynamic elements to the DOM still runs. When the page is saved, even if you didn't edit that particular Content Editor Web Part, the updated DOM elements get saved - including the tooltip changes!

I have tested this with a lot of scenarios, but it's pretty consistent across any SharePoint components that provide inline editing.

So, how do you allow JavaScript to dynamically update content on publishing pages? This is really going to depend on the nature of your content. Here are a few different scenarios I've employed in my site:

Extract the dynamic content and make it a web part.

This works well if your content doesn't change that often and has few properties. A great example here would be having a web part that uses swfobject.js to load a Flash file rather than placing the script call inside a Content Editor.

Skip the JavaScript calls that update your page when in edit or design mode.

This works really well for scenarios like my tooltip. You can do this either in JavaScript or C#, depending on how your page works.

Here is an example of checking your page mode with JavaScript, using jQuery:
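A minimal sketch follows. The helper name is my own, and the hidden fields it inspects are the ones SharePoint 2010 renders on web part pages (MSOLayout_InDesignMode) and wiki pages (_wikiPageMode):

```javascript
// Returns true when the SharePoint 2010 page is in edit/design mode,
// so DOM-modifying script can be skipped and won't get saved into the page.
function isPageInEditMode(form) {
  var designMode = form && form.MSOLayout_InDesignMode;
  var wikiMode = form && form._wikiPageMode;
  return !!((designMode && designMode.value === "1") ||
            (wikiMode && wikiMode.value === "Edit"));
}

// Usage with jQuery on a SharePoint page:
// $(function () {
//   if (!isPageInEditMode(document.forms[MSOWebPartPageFormName])) {
//     // safe to add tooltip elements to the DOM here
//   }
// });
```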

Saturday, July 31, 2010

While trying to publish a workbook to SharePoint 2010 to use with Excel Services, I ran into some problems. Specifically, it appeared that Excel wasn't able to access my Document library.

To publish my workbook, I followed instructions you might find anywhere on the web, using the Office 2010 Backstage to save to SharePoint. However, none of my libraries showed up by default. Not a problem! I can just type the URL into the address bar, right? Apparently Excel says that it "can't open this location using this program".

Generally problems like this are permissions related. Well, I was logged into the machine as my farm administrator, who has full rights to the Documents library. I was also able to click on files from SharePoint and have them open Excel automatically.

I decided to take a new approach. I opened the demo Excel workbook that came with the Business Intelligence Center template. I then made a small update and saved it back to SharePoint. Now when I opened Backstage to save to SharePoint, the Documents library showed up in the Recent Locations section.

I was hoping by doing this, I could trick Excel into finding the location. So I then tried to save my new workbook by just clicking on that recent location. However, Excel was still not able to do it, and told me it was unable to open my site.

After looking around on the web, I found a forum post that seemed promising, as I'm using a Windows Server 2008 R2 machine. I opened Server Manager and installed the Desktop Experience, which required me to install the Ink and Handwriting Services as a prerequisite. I was then prompted to reboot upon completion. After rebooting, I attempted to save my workbook to SharePoint again and it worked.

I've been playing a lot with Excel Services for the last week and while it is nice, it is also temperamental. Most of this can be chalked up to inexperience on my part as I work out the closest thing to a least-privilege configuration for a SharePoint 2010 Excel Services Service Application, in light of the ever-popular bug, "The workbook cannot be opened". However, I'm also convinced it's a little more particular than the previous version.

By default when I installed Excel Services using a PowerShell script, it had an entry for a Trusted File Location at "http://". After playing with the Business Intelligence Center template for a bit, I created my own workbook that had a PivotTable and a PivotChart that talked to my SSAS installation. Naturally I decided to update my Trusted File Location settings. So I deleted the default entry and created a more specific one that pointed directly to the Documents library that came with the template. Then I checked on the Excel example that came with the template and found that Excel Services was no longer working because it was "Unable to process the request".

Event Viewer showed nonstop critical errors and ULS had some gems in there pinning the blame on the Secure Store Service: "Request for security token failed with exception". I tried refreshing the key figuring that my WFE and App server were out of sync, but that didn't fix the problem. Lacking a better idea, I did an IIS reset on the WFE - and it started working again.

I'm not sure why it was unhappy, but at least it was only a 10-minute troubleshooting span and a simple fix!

Monday, July 26, 2010

Recently, while trying to experiment with Excel Services in SharePoint 2010, I decided to remove the service and do a reinstall with a PowerShell script. Since I like my scripts to create managed accounts as well, I removed the account that was running my service. Apparently, SharePoint didn't like this and I received an error stating that my SPManagedAccount could not be deleted because other objects depend on it when I loaded the Managed Accounts page in Central Administration. Well, not being able to load the page sort of limits my options for correcting the problem, doesn't it, Microsoft?

I double checked and the service application had been removed from SharePoint, the Excel Calculation Service had been stopped on the App Server, and the ApplicationPool had even been removed from IIS. Looking up the CorrelationID in the logs also didn't tell me very much.

I decided to pop open PowerShell and see what I could pull off. Turns out it was a rather simple fix. First, I looked up my Managed Account. I then tried to remove it and found out the dependency was an application pool. After removing the dependency, I was able to remove the Managed Account! The PowerShell commands to do this were:
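A sketch of those commands; the account name is a placeholder, and the property used to find the dependent pool may vary in your environment:

```powershell
# Look up the managed account that refuses to go away.
$account = Get-SPManagedAccount "DOMAIN\svc_excel"   # placeholder account

# Attempting removal reports the dependency - an orphaned application pool.
Remove-SPManagedAccount $account

# Find and remove the service application pool still bound to the account.
Get-SPServiceApplicationPool |
    Where-Object { $_.ProcessAccountName -eq $account.Username } |
    Remove-SPServiceApplicationPool

# With the dependency gone, the managed account can now be removed.
Remove-SPManagedAccount $account
```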

Thursday, June 17, 2010

Today I was building a web part for SharePoint 2010 and ran across an interesting problem. The web part was working fine for authenticated users, but was failing for anonymous users. My first thought was that Lockdown might have been the problem, but it turned out after looking through SharePoint log files that I was running into a dispose problem:

Trying to use an SPWeb object that has been closed or disposed and is no longer valid.

It's an interesting error message, considering that I was disposing of my object, but only after I'd finished with it. I believe the problem wasn't that my code was using a disposed SPWeb object, but rather that I had closed an SPWeb that SharePoint was relying upon.

Here was the original code, which, according to Best Practices, was closing objects that don't need to be closed:
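The pattern looked roughly like this (a reconstruction, not my exact code). Wrapping SPContext.Current.Web in a using block disposes an SPWeb that SharePoint itself still needs:

```csharp
// BAD: SPContext.Current.Web is owned by SharePoint - do not dispose it.
using (SPWeb web = SPContext.Current.Web)
{
    // ... read list data for the web part ...
}

// GOOD: use the context web without disposing it.
SPWeb contextWeb = SPContext.Current.Web;
// ... read list data for the web part ...

// Only dispose objects you open yourself:
using (SPSite site = new SPSite("http://server/sites/foo"))   // placeholder URL
using (SPWeb openedWeb = site.OpenWeb())
{
    // safe to dispose these when done
}
```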

Thursday, June 3, 2010

It's a fairly common requirement when building a public facing SharePoint site to make sure all pages share branding elements. This includes the SharePoint error pages. While working on a publishing site, you've probably encountered an article that touches on one of the error pages, but for some reason I haven't seen one that tries to cover all of them. So hopefully this will do the job!

Error Pages

If you are just looking to control the basic error pages, then look no further than the SPWebApplication.SPCustomPage enumeration, which provides a list of commonly updated pages. You use the SPWebApplication.UpdateMappedPage() method to set which pages SharePoint should serve up for each of the values within the enumeration. For a good example of this, see this post.
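In code, the call pattern is roughly this (the web application URL and page path are placeholders; the custom page should live under _layouts):

```csharp
SPWebApplication webApp = SPWebApplication.Lookup(new Uri("http://server"));  // placeholder

// Map the built-in "Error" page to a custom application page.
webApp.UpdateMappedPage(SPWebApplication.SPCustomPage.Error,
                        "/_layouts/CustomErrors/Error.aspx");                 // placeholder path
webApp.Update();
```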

HTTP 404
But what about other HTTP status codes? Well, the easiest would be the 404. SharePoint already has support for a custom 404 page, as long as it is pure HTML. The default 404 is located in 14\TEMPLATE\LAYOUTS\1033\sps404.html. The quickest way to override this is to use a feature to deploy your own HTML page into the same folder. A feature receiver can then update the SPWebApplication.FileNotFoundPage property with a relative path to your file. There are just a couple points to keep in mind when you do this:

Your custom page should be greater than 512 bytes. This is because Internet Explorer's "friendly" error messages feature will sometimes ignore pages smaller than this. You can read more about this problem at Microsoft Support.

Your custom page, if encoded as UTF-8, should not have a BOM. Even if you save your file as UTF-8 without BOM, editing it later in Visual Studio will add the BOM back, so be careful. You can read more about that problem here.
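Putting those pieces together, a feature receiver might look like this (the class and file names are mine; the property takes a path relative to the LAYOUTS language folder):

```csharp
public class Custom404FeatureReceiver : SPFeatureReceiver
{
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        // Web application scoped feature: the parent is the SPWebApplication.
        SPWebApplication webApp = (SPWebApplication)properties.Feature.Parent;

        // The HTML file our feature deployed into 14\TEMPLATE\LAYOUTS\1033.
        webApp.FileNotFoundPage = "custom404.html";   // placeholder file name
        webApp.Update();
    }
}
```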

If you'd like to have a custom 404 that performs server side code, such as to include web parts, you'll need to get a little more creative. There are two common ways to handle this. The first would be to have a static html page that performs a client side redirect with either a META tag or javascript. If you'd like to see that approach in action, here is an example.

A more robust solution would be to use an HttpModule. With the HttpModule, you have two more possibilities for how you serve your page. You can use a Response.Redirect, which is what the linked article suggests. The only downside to this is that the URL changes, which may or may not be desired. Alternatively, to keep the URL, you would need to use Response.Write.

HTTP 401
Now you have custom error pages for the SPCustomPage-enumerated pages as well as a 404 page. But what about a 401? The simplest way you might think to do this would be to edit the web.config customErrors element. In any normal ASP.NET application, that would be sufficient. Unfortunately, it doesn't work in SharePoint.

Your next instinct might be to try IIS, as it provides the ability to set custom error pages. If you edit the property for the 401.2 status code and point it to a custom HTML page on your site, it may or may not work. In testing, it turns out that having Anonymous Access enabled in SharePoint (which then sets it in IIS) prevents a custom 401.2 page from being used. However, if Anonymous Access is disabled, such as with an Intranet site, then the custom page will show just fine. Since we are talking about a public facing deployment you are probably using the Publishing template, so you're going to need a different approach.

The trick for a custom 401 message when you have Anonymous Access enabled is using a custom http module, much like you could have done for the 404.
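A sketch of such a module (the class name and message are mine): intercept the response at EndRequest and, when the status is 401, write out the custom message while leaving the status code alone so the authentication prompt still works.

```csharp
public class Custom401Module : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.EndRequest += OnEndRequest;
    }

    private void OnEndRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;
        if (app.Response.StatusCode == 401)
        {
            // Keep the 401 status, but replace the body the user sees
            // if they cancel the authentication prompt.
            app.Response.Write(
                "<html><body>Sorry, you are not authorized to view this page.</body></html>");
        }
    }

    public void Dispose() { }
}
```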

Once you hook that up in your web.config, go ahead and try to load http://YourSite/_layouts/settings.aspx. You should get the same authentication prompt you'd expect for an anonymous user, but if you hit cancel you'll now see your custom 401 message. Note that the way you register an HttpModule in 2010 is slightly different. If you add the module to the httpModules element in web.config, it may not work. In testing, I was only able to get my module to work by adding it to the modules element with the other SharePoint HttpModules.

As a final level of customization, you could consider having no prompt at all on your public site when users request restricted content. The way to do this is to extend the site and have a private site as well. Since we are just talking about two different web sites in IIS (with corresponding settings and web.config files), you can control their authentication and error settings independently while still serving up the same content. The public site could then have Anonymous Access enabled and Windows Authentication disabled. This removes the login prompt when a 401 occurs on the public site. The private site would have Anonymous Access disabled but Windows Authentication enabled, allowing users to log in and manage content as needed.

Quirks
Since you are probably using the Publishing template, it's worth mentioning the Lockdown feature. This feature stops users from being able to see your Form pages. It also locks down lists and libraries, such as the Style Library. So make sure if you use CSS in your custom error pages that you know whether you have lockdown enabled and plan locations for your assets appropriately.

Wednesday, May 26, 2010

I've been working with Content Deployment jobs lately, and noticed that in 2010, the section on the Create Jobs page where you can specify whether to perform an incremental or full content deployment was missing. Just to make sure I wasn't crazy, I looked up the old screen from 2007, which looks like this:

As you can see, there is a section there for setting the types of content that are deployed. The first option, "Deploy only new, changed, or deleted content" is the incremental deployment. The second option, "Deploy all content, including content that has been deployed before", is the full deployment.

Now compare that screen to the 2010 equivalent:

For some reason, the Deployment Options section is missing. This is not gonna work. If you recall from 2007, doing a full deployment after the initial deployment can cause problems. See this article for more information about why.

You can't do it through Central Administration, but you can specify whether you want to do a full or incremental deployment using PowerShell. Go to your SharePoint PowerShell prompt and type get-help new-spcontentdeploymentjob -detailed and then take a look at the IncrementalEnabled parameter.

Here is a simple script to create an incremental content deployment job, assuming you have already defined a content deployment path of "Authoring to Production":
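This is a sketch of the commands involved; the path and job names match the example, but adjust them to your own deployment path:

```powershell
# Look up the existing content deployment path.
$path = Get-SPContentDeploymentPath "Authoring to Production"

# Create an incremental job bound to that path.
New-SPContentDeploymentJob -Name "Authoring to Production - Incremental" `
    -SPContentDeploymentPath $path `
    -IncrementalEnabled:$true
```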

Looking in Central Administration, you'll see your created job, but you still can't see if it's incremental or full. To do that, run Get-SPContentDeploymentJob "Authoring to Production - Incremental". You should notice the following line in your output:

ExportMethodType : ExportChanges

That sounds awfully like an incremental doesn't it? Let's try setting the IncrementalEnabled parameter to false and creating another job:
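Same sketch as before, with IncrementalEnabled flipped to false to get a full deployment job:

```powershell
$path = Get-SPContentDeploymentPath "Authoring to Production"

# Create a full deployment job this time.
New-SPContentDeploymentJob -Name "Authoring to Production - Full" `
    -SPContentDeploymentPath $path `
    -IncrementalEnabled:$false
```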

Now run Get-SPContentDeploymentJob "Authoring to Production - Full" and you should see a slightly different export method type:

ExportMethodType : ExportAll

So, this means that your Content Deployment job is going to be incremental by default. The only way to get a full job is to create it via PowerShell. Given the limitations and caveats with full jobs, this seems like a good change.

Thursday, May 6, 2010

I ran into a problem recently trying to use Embedded Resources with SharePoint 2010 web parts. I was using code I had used directly in non-SharePoint ASP.NET server controls; if you aren't familiar with the technique, it's pretty simple. I'll briefly describe how you'd use an embedded resource, then describe my problem and how I fixed it.

First, you need to just add your resource, which in my case was a stylesheet, to your project. Then, using the properties pane, set the Build Action to Embedded Resource. Once you've done that, you'll need to edit the AssemblyInfo.cs file within your project and enable the embedded resource.
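The AssemblyInfo.cs entry looks like this; the resource path is a placeholder built from your default namespace and folder structure:

```csharp
// AssemblyInfo.cs - register the embedded resource and its content type.
[assembly: System.Web.UI.WebResource("MyCompany.WebParts.Resources.styles.css", "text/css")]
```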

Now you have a resource that will be embedded in your DLL. The path to that resource is specified in the WebResourceAttribute above. I usually compile and open my DLL in .NET Reflector at this point to confirm that the path I've picked is correct. Once it is, you can include your stylesheet like this:
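A sketch of the include; the resource path must match what you registered in AssemblyInfo.cs, and HtmlLink comes from System.Web.UI.HtmlControls:

```csharp
// Resolve the WebResource.axd URL and add a stylesheet link to the page head.
string cssUrl = Page.ClientScript.GetWebResourceUrl(
    this.GetType(), "MyCompany.WebParts.Resources.styles.css");  // placeholder path

HtmlLink cssLink = new HtmlLink { Href = cssUrl };
cssLink.Attributes.Add("rel", "stylesheet");
cssLink.Attributes.Add("type", "text/css");
Page.Header.Controls.Add(cssLink);
```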

In theory, that should work, right? It works in server controls, and it's worked for me in older web parts for MOSS 2007. But, when I built my 2010 web part, it wasn't working.

I viewed source on my page and could see that the embedded resource was added to the markup. But when I tried to load that URL myself, I got an error, implying my resource wasn't there at all! I checked and rechecked my path inside the DLL. Everything was right.

But then I noticed something different about 2010 web parts. They use a LoadControl mechanism. I was accessing my embedded resources, using the above code, directly inside the webpart.ascx.cs, not the webpart.cs file. On a hunch, I moved the code into the webpart.cs file. Success!

This got me curious, so I looked at the source for the generated HTML and noticed that the generated URL for the embedded resource had changed. Apparently, the path is based somewhat on that first parameter to GetWebResourceUrl(), which is a type. Tinkering a little more, I learned that I could actually leverage the embedded resource from the webpart.ascx.cs by making a minor adjustment.
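The adjustment (assuming a web part class named MyWebPart in the same assembly as the resource): pass the web part's type explicitly instead of the user control's type, so the generated URL resolves against the right type.

```csharp
// Inside webpart.ascx.cs - this.GetType() here would be the user control's
// type and produces a WebResource URL that doesn't resolve. Pin the type:
string cssUrl = Page.ClientScript.GetWebResourceUrl(
    typeof(MyWebPart), "MyCompany.WebParts.Resources.styles.css");  // placeholder names
```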

Tuesday, May 4, 2010

When working with relational lists in SharePoint 2010, you have the option to use LINQ to SharePoint or CAML to join those lists to pull data out. While LINQ is easier to use and will leverage CAML under the covers, it is not always capable of performing queries that CAML can directly.

For instance, let's say you have a list called Parent. This list has a single-valued lookup column, PrimaryChild, that refers to the Children list. LINQ can very easily perform a query with a where clause based on PrimaryChild:
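With SPMetal-generated entities (the context and property names are assumed), such a query could look like:

```csharp
using (MyDataContext ctx = new MyDataContext("http://server/sites/foo"))  // placeholder URL
{
    // Single-valued lookup: the generated property exposes the child entity directly.
    var parents = from p in ctx.Parent
                  where p.PrimaryChild.Title == "Johnny"
                  select p;

    foreach (var parent in parents)
        Console.WriteLine(parent.Title);
}
```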

However, if you were to create a second lookup column, OtherChildren, that allowed multiple values, LINQ would run into difficulties because the lookup is now represented as an EntitySet. With a single valued lookup field, you would instead have just a strongly typed object, with direct access to the fields within that list item.

Running this query will throw an exception that semi-efficient queries are not allowed. If you recall, this is because LINQ leverages Two-Stage Queries. So, you can continue with LINQ and perform the query in two stages, or you can change the query to use CAML.

Here is the same query in CAML, but this time, the query will not cause an exception.
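The query was shaped roughly like this (a reconstruction of the pattern with the example's names, not my exact query). Note that the ListAlias is the internal name of the lookup field, OtherChildren:

```csharp
SPQuery query = new SPQuery();

// Join through the lookup field OtherChildren on the parent list.
query.Joins =
    @"<Join Type='INNER' ListAlias='OtherChildren'>
        <Eq>
          <FieldRef Name='OtherChildren' RefType='Id' />
          <FieldRef List='OtherChildren' Name='ID' />
        </Eq>
      </Join>";

// Only projected fields are eligible to be view fields.
query.ProjectedFields =
    @"<Field Name='ChildTitle' Type='Lookup' List='OtherChildren' ShowField='Title' />";

query.Query =
    @"<Where>
        <Eq><FieldRef Name='ChildTitle' /><Value Type='Lookup'>Johnny</Value></Eq>
      </Where>";

query.ViewFields = "<FieldRef Name='Title' /><FieldRef Name='ChildTitle' />";
```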

You'll notice that in the example above I'm performing a CAML join and also leveraging Projected Fields. Here are some guidelines/rules to keep in mind:

The CAML has several attributes that ask for the list name or list alias. This is NOT the actual name of the list. Rather, it is the internal name of the lookup field within your list. So in our example, the list was named Children, and the field was OtherChildren. We used OtherChildren to build the join and projected fields.

Projected Fields used in your query do not have to match up to the Projected Fields you have specified in the Child list. Those are a UI convenience and not used by your CAML.

If you want to display a value from the Child list, you need to make sure you have a Projected Field in your CAML. Only those fields which are projected are eligible to be View fields.

All Projected Fields will become SPFieldLookupValue objects (or perhaps SPFieldLookupValueCollection, though I haven't yet had one of my lists do this). Those lookups will always contain the ID of the child list item, along with the value of the field you selected from that child item.

Thursday, April 29, 2010

I have a custom scope I've created that I am querying against with the FullTextSqlQuery object. Recently I was asked if we could allow a phonetic search. With all the hype around this new feature in SharePoint 2010, I thought it would be very easy.

I looked, and sure enough, there is an EnablePhonetic property right on FullTextSqlQuery. I set it to true, ran my search and got back...nothing. I figured maybe it didn't like my query, which used the LIKE keyword. That wasn't it. I tried looking on the web, and it seems all the mentions of phonetic search are closer to press releases than coding snippets.

The one concrete note I did find was this: "For FAST Search Server 2010 for SharePoint, this property is only applicable for People Search."

Well, I'm not using FAST, but I am using a custom search scope. It would appear that might be the limiting factor here. While I am searching for people, they are stored in a custom list, since they are not members of my SharePoint site. Therefore I was using a custom scope to get to them.

It would seem phonetic search is not available yet to customize in this way.

Wednesday, April 28, 2010

If you are using the Managed Metadata Service and developing a custom web part, you may be interested in using that nice term picker that you see when you create a new list item that contains a Managed Metadata Column.

First, let's cover the bare minimum code you'll need to get the term picker to show up in your web part. First you'll want to add a reference to Microsoft.SharePoint.Taxonomy to your project. Then you'll need a Register directive in your ascx, like this:
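The directive looks like this (version and public key token are for SharePoint 2010):

```aspx
<%@ Register TagPrefix="Taxonomy" Namespace="Microsoft.SharePoint.Taxonomy"
    Assembly="Microsoft.SharePoint.Taxonomy, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c" %>
```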

Now you can add the picker to your web part just like any other control.
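For example:

```aspx
<Taxonomy:TaxonomyWebTaggingControl ID="TermPicker" runat="server" />
```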

You'll notice that I didn't set any properties on the TaxonomyWebTaggingControl. Well, there are three properties you'll need to set, but we'll do it programmatically. Those properties are:

SSPList - This property parses your string and builds a List to set the SspId property

GroupId - A Guid

TermSetList - This property parses your string and builds a List to set the TermSetId property

Now, you can set these properties declaratively if you wish, but it won't be a very portable web part, as the Guids aren't part of the import CSV format for the Managed Metadata Service, nor do any of the Taxonomy Create methods allow setting of Ids. Because of this, it's best to have your web part set these properties on the TaxonomyWebTaggingControl programmatically. If you just want to see how you would go about setting these properties declaratively, scroll to the bottom of this post, as I've included it for reference.

In order to make your web part portable, you're going to need to query the Managed Metadata Service and get these values yourself. Moreover, you're going to have to do it on every load of your web part since these properties are not stored anywhere between page loads.

Note: If you only set these properties on page load, this will work as long as there are no scenarios where you need to redraw the control again, such as having a custom server validator. In that instance, your picker would work before the postback, but the look-ahead feature would be broken after the postback.

In order to make loading our properties a little easier, here is an extension method for TaxonomyWebTaggingControl that allows you to quickly set the properties. The path part of this was inspired by PHolpar. Also, remember that extension methods must be defined in static classes.
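A sketch of such an extension method. The method name is mine, and it assumes (per the SSPList/TermSetList note above) that SspId and TermSetId are Guid lists while GroupId is a single Guid; it resolves the Ids by walking the default term store by group and term set name:

```csharp
public static class TaxonomyWebTaggingControlExtensions
{
    // Resolve SspId, GroupId, and TermSetId from names so the web part stays portable.
    public static void SetTargetTermSet(this TaxonomyWebTaggingControl picker,
        SPSite site, string groupName, string termSetName)
    {
        TaxonomySession session = new TaxonomySession(site);
        TermStore termStore = session.DefaultSiteCollectionTermStore;
        Group group = termStore.Groups[groupName];
        TermSet termSet = group.TermSets[termSetName];

        picker.SspId.Add(termStore.Id);
        picker.GroupId = group.Id;
        picker.TermSetId.Add(termSet.Id);
    }
}
```

Remember this must run on every page load, since the properties aren't persisted between loads.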

Tuesday, April 27, 2010

I had a need recently to perform a query with FullTextSqlQuery but to limit the results and then display them in a random order. The FullTextSqlQuery class does support a RowLimit property, though this would not work in my case because I didn't want to get the exact same answers every time.

The PerformSearch method is just a standard setup for using FullTextSqlQuery to return a DataTable. Note that I limit my resultset as much as I can with the FullTextSqlQuery.QueryText property to try to avoid hitting a throttling exception.

With the limited results returned, I then want to randomize the order of the records. LINQ allows us to do this the same way we would in SQL, and I just order by a random Guid. Once the results are randomized, we just grab the number of records we need with the Take() extension method.
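The randomize-and-take step looks roughly like this (PerformSearch is the wrapper described above; the row count is a placeholder, and DataTable.AsEnumerable needs a reference to System.Data.DataSetExtensions):

```csharp
DataTable results = PerformSearch();   // FullTextSqlQuery wrapper returning a DataTable

// Order the limited result set by a random Guid, then take what we need.
var randomRows = results.AsEnumerable()
                        .OrderBy(row => Guid.NewGuid())
                        .Take(5);      // number of records to display
```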

Friday, April 23, 2010

I've been working on importing a set of Terms into my Managed Metadata Term Store from a 3rd party database. However, I ran into a snag. When I execute the following code, I get an error, "There is already a term with the same default label and parent term."
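The code was shaped like this (a reconstruction with placeholder names): check the Terms collection for the name, and create the term only if it isn't found.

```csharp
// First attempt: check the Terms collection by name before creating.
// Requires System.Linq; 1033 is the English LCID.
if (termSet.Terms.FirstOrDefault(t => t.Name == "Foo & Bar") == null)
{
    termSet.CreateTerm("Foo & Bar", 1033);
}
```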

I attached my debugger and found out that even though I was checking for my term, the code would say it wasn't there, even though it was! The problem was that the specific term causing me problems had an ampersand. The term name I supplied was "Foo & Bar", but the value put into the TermStore actually contained the Unicode version of the ampersand.

The name value will be normalized to trim consecutive spaces into one and replace the & character with the wide character version of the character (\uFF06). The leading and trailing spaces will be trimmed. It must be non-empty, cannot exceed 255 characters, and cannot contain any of the following characters: ; " < > | or tab.

That was indeed the behavior I was seeing. Thinking I had just found a limitation of the TermSet.Terms collection, I changed my code to this:
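The second attempt used GetTerms to look up the existing term by label (a sketch with the same placeholder names):

```csharp
// Second attempt: ask the term set for terms matching the label.
TermCollection existing = termSet.GetTerms("Foo & Bar", false);
if (existing.Count == 0)
{
    termSet.CreateTerm("Foo & Bar", 1033);
}
```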

I ran again, and this time instead of blowing up on just that one case, my code went crazy trying to insert a bunch of different terms that were working fine before and throwing way more exceptions. A little research on this method led me to this post, which suggests the TermSet.GetTerms method does not work in Beta 2, which seems to be what I just discovered as well.

I decided to explore the reference to normalizing term names from the MSDN link. My final pass at the code became:
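My final pass normalized the name the same way the term store does before checking for an existing term. This is a sketch; the helper is mine and mirrors the documented normalization rules:

```csharp
// Mirror the term store's normalization: collapse runs of whitespace, trim,
// and swap & for its wide-character equivalent (\uFF06).
private static string NormalizeTermName(string name)
{
    string normalized =
        System.Text.RegularExpressions.Regex.Replace(name.Trim(), @"\s+", " ");
    return normalized.Replace('&', '\uFF06');
}

// Usage: compare against the normalized label before creating the term.
string label = NormalizeTermName("Foo & Bar");
if (termSet.Terms.FirstOrDefault(t => t.Name == label) == null)
{
    termSet.CreateTerm("Foo & Bar", 1033);
}
```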

Monday, April 19, 2010

Now that I can effectively use LINQ to access my Managed Metadata Columns, I'd like to only pull back those columns that contain values I need. For single valued Managed Metadata Columns, this is very straightforward:
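With generated entities (names assumed), the single-valued case can cast the object-typed property and filter on its label:

```csharp
// Single-valued Managed Metadata column: cast the generated object property.
var tagged = from item in ctx.MyList
             where ((TaxonomyFieldValue)item.MyTag).Label == "Foo"
             select item;
```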

For multi valued Managed Metadata Columns, my first attempt was a bust. I tried the following expression, but received a compiler error, "An expression tree may not contain an anonymous method expression".
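The workaround that did compile (a sketch with assumed entity names): materialize the items first with ToList(), then filter in memory with LINQ to Objects, where the cast and lambda are fine.

```csharp
// This shape fails when compiled into an expression tree:
// where ((TaxonomyFieldValueCollection)item.OtherTags).Any(v => v.Label == "Foo")

// Workaround: pull the items down first, then filter client-side.
var matches = ctx.MyList.ToList()
    .Where(item => ((TaxonomyFieldValueCollection)item.OtherTags)
        .Any(v => v.Label == "Foo"))
    .ToList();
```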

Now I can query specifically for list items that use specific terms. Keep in mind that this filtering does not happen until after the list items have been pulled down, so you may want to do some additional filtering to narrow the results so you don't run afoul of the new Throttling feature of SharePoint 2010.

Here was the CAML generated by the last LINQ query, which shows that the additional filtering for Term was done after the records were pulled.

I tried another pass on my parameters file. This time I used the following XML.
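The file follows the documented SPMetal parameters schema; the list, class, and column names here are examples, not the originals:

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- SPMetal parameters sketch; list/class/column names are examples. -->
<Web xmlns="http://schemas.microsoft.com/SharePoint/2009/spmetal">
  <List Name="My List">
    <ContentType Name="Item" Class="MyItem">
      <!-- Force a property to be generated for the Managed Metadata column. -->
      <Column Name="Keywords" Member="Keywords" />
    </ContentType>
  </List>
</Web>
```

The file is passed to SPMetal with something like `SPMetal.exe /web:http://server /parameters:Parameters.xml /code:MyData.cs`.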

On my typed list item object, I now have the one extra Column that I specified in my parameters file. LINQ will create a property for this field that is just an object. However, if you access this property, you can cast it to a TaxonomyFieldValueCollection or TaxonomyFieldValue, as appropriate. This provides exactly the information we wanted:
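A sketch of reading it back, assuming a multi-valued column surfaced as an `object` property named `Keywords` on a hypothetical `MyItem` entity:

```csharp
using System;
using Microsoft.SharePoint.Taxonomy;

foreach (MyItem item in dc.GetList<MyItem>("My List"))
{
    // The generated property is just object; cast per the column type.
    var values = item.Keywords as TaxonomyFieldValueCollection;
    if (values == null)
        continue;

    foreach (TaxonomyFieldValue value in values)
        Console.WriteLine("{0} ({1})", value.Label, value.TermGuid);
}
```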

Today I am playing with LINQ to SharePoint. I created a simple list and added a Managed Metadata Column to my list. I then used SPMetal to generate a DataContext class. My Managed Metadata Column is nowhere to be found. I looked through options for SPMetal to see if maybe it just needed a switch to capture those Managed Metadata Columns, but I don't see one.

So it appears that Managed Metadata Columns have no support in LINQ to SharePoint. They are also not eligible to be Projected Fields. It would seem Microsoft didn't really flesh out all the ways the new Managed Metadata Service might be used.

Thursday, April 15, 2010

Today I needed to create some list items programmatically for a list that contained Managed Metadata columns. Since Managed Metadata columns are specialized SPLookups under the covers, setting your field value is a little more involved than just assigning the text value of your Term.

Here is a method that will set the value for you, and an example of how to use it. Note that in my example the Managed Metadata column is multi-value, even though the example method is only set up to add a single Term. If you have a single-value Managed Metadata column, you can leave out the TaxonomyFieldValueCollection.
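A comparable sketch of the approach: resolve the Term behind a name, then let `TaxonomyField.SetFieldValue` build the underlying lookup value. Field and term names are examples, and error handling is omitted:

```csharp
using System.Linq;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Taxonomy;

// Sketch: set a Managed Metadata field from a term name.
private static void SetTaxonomyField(SPListItem item, string fieldName, string termName)
{
    TaxonomyField field = (TaxonomyField)item.Fields.GetField(fieldName);

    // Resolve the term set the column is bound to.
    TaxonomySession session = new TaxonomySession(item.Web.Site);
    TermStore store = session.TermStores[field.SspId];
    TermSet termSet = store.GetTermSet(field.TermSetId);

    // Simple name match for the sketch; remember term names are normalized.
    Term term = termSet.Terms.Cast<Term>().First(t => t.Name == termName);

    // SetFieldValue takes care of the SPLookup plumbing (WssId etc.).
    field.SetFieldValue(item, term);
}

// Usage:
// SetTaxonomyField(listItem, "Keywords", "Furniture");
// listItem.Update();
```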

Earlier this week I was having problems with my SharePoint logs being empty. At the time, the only fix I could discover was giving my AppPool identity Administrator rights on the machine. The reason the account lacked such permissions was that I had given it no rights when I created it, the same way I would have when installing MOSS 2007.

Well giving Administrator rights to your AppPool identities in SharePoint 2010 is also a bad thing, as shown by the new Health Monitoring:

However, at this time, I can't seem to find what permission set is needed for ULS to be accessible by my AppPool account. Right now my choices are to upset Health Monitoring or to have logs. Having both is apparently a luxury.

Wednesday, April 14, 2010

I have been working on building some custom Search components, which will leverage my own Managed Properties. However, early on in the process I hit a snag.

I created some sample content and then ran a full index of my farm. SharePoint was able to discover some new Crawled Properties during this process, which I was hoping to turn into Managed Properties. However, when I clicked on any of the new Crawled Properties, I got the error "Unable to cast object of type 'System.DBNull' to type 'System.String'":

I then opened up Reflector to find what was going on in that method. It turns out to be a very simple method that just calls a stored procedure. I fired up SQL Server Profiler and tracked down this call, which ultimately was breaking the page:

I've been noticing since my install that SharePoint 2010 doesn't seem to be logging much. It was logging some, but most of the logs were very small. Contrast this with SharePoint 2007, where you could easily generate several hundred kilobytes of logging within a few minutes, and you can tell something is wrong.

I tried a lot of things to get to the root of my problem. The questions I was trying to answer were why I would sometimes get logs and other times not, and why none of my Correlation IDs could ever be found. Among the things I tried:

While doing this, I started reading the log entries I was receiving and noticed something. Each of the entries listed the process that was adding the entry. I saw entries from the following processes:

STSADM.EXE

PowerShell.exe

wsstracing.exe

vssphost4.exe

psconfigui.exe

Surprisingly, I didn't see a single entry for w3wp.exe. I decided to check permissions on the LOGS folder. The account that runs my AppPool, sp_Farm, has almost every permission there is and should be able to do almost anything to the log files. It didn't make sense.

In frustration I put all my AppPool accounts in the Administrators group and rebooted the machine. Suddenly my logs are going wild.

I'm not entirely sure why my AppPool accounts couldn't write to the logs when they were already in WSS_ADMIN_WPG, but there is clearly some missing permission somewhere. I'll have to break out procmon when I have some free time to figure out which permission was missing.

Tuesday, April 13, 2010

I'm still waiting on some new RAM to arrive in the mail, so currently my development machine for SharePoint 2010 is a 2 GHz Core 2 Duo laptop running a VM with 2 GB of RAM. Needless to say, I've gotten used to looking at this:

I have a feature that provisions a few Site Columns and a Content Type that will use those Site Columns. This was easy enough to achieve with some XML packaged into a Feature. However, my Content Type also needed to have a Managed Metadata column, which doesn't work with XML.

The reasons you can't do a Managed Metadata Site Column through XML are the same reasons Lookup columns don't work - the GUIDs that refer to the underlying objects change between environments, so you can't hardcode them into the XML. If you wanted to write the XML by hand to do this and already knew the GUIDs, it would of course work, but that's more effort than it's worth for such a limited solution.

My first thought came from an earlier experiment: when I exported a Content Type that I had created through the GUI, I found that my Managed Metadata Site Column, as provisioned, ended up being several different SPFields under the covers. So in my first pass, I tried to create all of those fields myself. That ended up breaking my site, because I was unable to remove all the SPFields/SPContentTypes after the fact!

Well, it turns out the answer isn't so hard after you try all the wrong ways first. I ended up having my Feature that provisions the base Site Columns and Content Type via XML. I then created a Feature Receiver that creates the Managed Metadata Site Column and then adds it to the Content Type. Here is the code:
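The original listing survives only as an image, so here is a sketch of the pattern: the XML feature provisions the plain columns and the Content Type, and this receiver adds the Managed Metadata column at activation time, when the real GUIDs are known. The service, group, term set, and column names are examples:

```csharp
using Microsoft.SharePoint;
using Microsoft.SharePoint.Taxonomy;

public override void FeatureActivated(SPFeatureReceiverProperties properties)
{
    SPSite site = (SPSite)properties.Feature.Parent;  // assumes a Site-scoped feature
    SPWeb web = site.RootWeb;

    // Look up the term set the column should bind to.
    TaxonomySession session = new TaxonomySession(site);
    TermStore store = session.TermStores["Managed Metadata Service"];
    Group group = store.Groups["My Group"];
    TermSet termSet = group.TermSets["My Term Set"];

    // Create the taxonomy Site Column and bind it to the term set.
    TaxonomyField field =
        (TaxonomyField)web.Fields.CreateNewField("TaxonomyFieldType", "Category");
    field.SspId = store.Id;
    field.TermSetId = termSet.Id;
    field.Group = "My Site Columns";
    web.Fields.Add(field);

    // Attach the new Site Column to the Content Type provisioned via XML.
    SPContentType ct = web.ContentTypes["My Content Type"];
    ct.FieldLinks.Add(new SPFieldLink(web.Fields["Category"]));
    ct.Update(true);
}
```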

I ran into an issue yesterday that I haven't completely figured out just yet. Specifically, I had a feature receiver that was querying a Term Store to create a Site Column. Unfortunately, it kept failing. When I attached a debugger to the process it eventually showed me that TaxonomySession.TermStores was empty.

After playing with it a bit, I finally got it to work by opening up Central Administration and going to the Managed Metadata Service page to "warm" up the Taxonomy service. I'm not sure if it was a web service time out issue or not, but that seems likely. I believe that the worker process powering the Taxonomy web service was taking too long to spin up on my resource starved laptop, which was causing TaxonomySession to give up on loading any TermStores.

Unfortunately, ULS shut itself down for some reason during this period, and I was unable to find any logging event that would give me more insight.

Today I decided to create a Content Type for SharePoint 2010. As far as I can tell, while there has been a lot of effort in building in taxonomy improvements, such as tagging, the Content Type is still alive and well and works the same.

The easiest way to create a Content Type feature in MOSS 2007 was to create it in the UI first and then extract it. Normally I use a tool for that, but decided this time I'd give it a shot and write code to do it myself. In the process I could see if the API had changed in this area. It appears to work the same, so experience in MOSS 2007 will apply directly.

Here is the code if you want to try it yourself. The last line will be the contents of your Elements.xml file.
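The listing itself was lost, but a sketch of the technique looks like this: build the Content Type through the object model, then dump its schema, which is what you would paste into Elements.xml. The URL and names are examples:

```csharp
using System;
using Microsoft.SharePoint;

using (SPSite site = new SPSite("http://localhost"))
using (SPWeb web = site.OpenWeb())
{
    // Derive a new Content Type from the built-in Document type.
    SPContentType parent = web.AvailableContentTypes[SPBuiltInContentTypeId.Document];
    SPContentType ct = new SPContentType(parent, web.ContentTypes, "My Document");
    web.ContentTypes.Add(ct);

    // Attach an existing Site Column for the example.
    ct.FieldLinks.Add(new SPFieldLink(web.Fields["Title"]));
    ct.Update();

    // The schema XML becomes the body of your Elements.xml entry.
    Console.WriteLine(ct.SchemaXml);
}
```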

In learning SharePoint 2010, I'm trying to take a minimalist approach, and only install services that I need. The problem is I don't yet know what those services are!

So far, on my new installation I have a Managed Metadata Service and a Search Service Application. Thinking to test Search, I edited one of the default pages and then tried to publish it and got the following message:

I suppose that the State Service needs to go on my list of required services. So, I went to install it from the Manage Service Applications page. For some reason, however, it was not listed!

Of course, given my problems with the Managed Metadata Service, I now know that the true magic in SharePoint comes from the Farm Configuration Wizard. I opened it up and was not disappointed when I found this guy:

That leaves me with the following services now running in my environment, and publishing workflows that display properly.

While exploring SharePoint I often read through Microsoft's code. With 2010, they added a lot of popup windows, and it's not always obvious which page is actually building the content of those popup windows.

In this particular case, I wanted to know what the page was when I was trying to create a new Service Application. If you click the New button in the Ribbon, you'll get a nice flyout menu with a bunch of services to choose from. But where do they go?

While playing around trying to figure out how SharePoint 2010 populates that wonderful drop-down list of AppPools, as seen in the screenshot just below, I noticed that all the classes it uses are marked internal sealed.

Since I wanted to learn more about how this control worked, I decided to use Reflector to poke around. Here is the small amount of info I discovered about the items in the list: