Today I released a NuGet package for Experience Manager and DD4T (.NET). It allows a developer to easily add the required markup to their (View)Models to enable the inline editing features of the Experience Manager from SDL Tridion. Only use this package if you use the DD4T .NET framework!

Install the package using the NuGet Package Manager Console:

Install-Package DD4T.XPM

The installer automatically adds two files to the root of your MVC web application: SiteEdit_config.xml and RegionConfiguration.xml.
It also updates the web.config in the ‘Views’ folder to use the DD4T.XPM.XpmWebViewPage as pageBaseType and includes
the DD4T.XPM.HtmlHelpers namespace. After installing the package it’s recommended to restart Visual Studio.

How to use

1) Decorate your Models with the XPM Attributes:

2) Create your model and call the ‘MakeInlineEditable’ method

3) Use the DD4T.XPM helpers to write out the value in the View
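Put together, the three steps could look roughly like this. This is a hedged sketch only: the ‘Article’ model, the field names and the exact attribute/method signatures are assumptions based on the description above, not the package’s actual API.

```csharp
// Sketch only -- 'Article', the field names and the attribute
// parameters are made up for illustration.

// 1) Decorate your (View)Model with the XPM attributes:
[InlineEditable]
public class Article
{
    [InlineEditableField(FieldName = "heading")]
    public string Heading { get; set; }
}

// 2) Create the model and call 'MakeInlineEditable'
//    (the exact signature may differ):
var article = new Article { Heading = component.Fields["heading"].Value };
article.MakeInlineEditable(component);

// 3) In the View, use the XPM helper to write out the value plus markup:
//    @XPM.Editable(m => m.Heading)
```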

That’s all.

Regions

Regions are configured in the file ‘RegionConfiguration.xml’ in the root of your web application. This file is added by the NuGet installer. In your view you can use the following call to write out the region markup:

@XPM.RegionMarkup("PageHeader")

PageHeader is the ID of the region as configured in the RegionConfiguration.
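I don’t have the generated file at hand, but an entry in RegionConfiguration.xml could look something along these lines. The element and attribute names are illustrative; check the file the installer generated for the real schema.

```xml
<!-- Illustrative sketch only; the real schema may differ. -->
<regions>
  <region id="PageHeader">
    <!-- Component types (schema/template pairs) allowed in this region -->
    <componentType schema="tcm:5-123-8" template="tcm:5-456-32" />
  </region>
</regions>
```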

Final notes

The NuGet installer adds the ‘SiteEdit_config.xml’ file to the root of your project. If this file is present, the XPM helper methods write out the markup (provided you called ‘MakeInlineEditable’). If the file is not present, the helpers don’t output the markup, just the value. Of course you want to control the call to ‘MakeInlineEditable’ based on the environment you’re in: only call this in staging!

This package is developed with .NET 4.5.1 and NuGet version 2.7. I did *not* test it with other .NET frameworks, but I assume it just works.

While working on a Custom Resolver, I needed to grab some configuration values. This seems fairly straightforward, and the documentation from SDL Tridion covers this. It states that we have to add a ConfigurationSection to the ‘Tridion.ContentManager.config’ file and that we can read these values using the following code:
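The documented snippet is not reproduced here; it boils down to something like the following hedged reconstruction. The section name ‘customResolverSettings’ and the key are made up; the documentation itself uses a ‘Config.GetConfig’ helper instead of plain System.Configuration.

```csharp
using System.Configuration;

// Hedged reconstruction -- 'customResolverSettings' and 'someKey' are
// made up. A plain System.Configuration equivalent of the documented
// approach looks like this:
var section = (AppSettingsSection)ConfigurationManager
    .GetSection("customResolverSettings");
string value = section.Settings["someKey"].Value;
```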

It’s unclear where ‘Config.GetConfig’ comes from, but there’s more. The SDL Tridion Content Manager uses different services to resolve items to be published. The following SDL Tridion services use your Custom Resolver:

– TcmServiceHost
– TcmPublisher

The TcmServiceHost calls the resolver when a user clicks on the ‘Show Items to publish’ button in the ‘Publish’ popup.
The TcmPublisher calls the resolver when the item is actually published.
Both services have their own executable and their own configuration: TcmServiceHost.exe.config and TcmPublisher.exe.config (Located in the %Tridion_Home%\bin directory)

So, after adding the configuration for our custom resolver to the Tridion.ContentManager.config file I hooked up the debugger to the TcmPublisher and clicked ‘Publish’: no configuration values were found. Which makes perfect sense, since the TcmPublisher.exe uses the TcmPublisher.exe.config as its configuration source. The same is true for the TcmServiceHost: it uses the TcmServiceHost.exe.config as its configuration source.

How to solve this configuration issue?

Well, luckily both config files have a reference to the ‘Tridion.ContentManager.config’ file: (All Tridion Content Manager executables/services have a reference to this config file)
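From memory, that reference looks roughly like this. It is a sketch: the exact paths and type names depend on your Tridion version and install location.

```xml
<!-- Sketch of the reference found in TcmPublisher.exe.config and
     TcmServiceHost.exe.config; verify against your own install. -->
<configSections>
  <section name="tridionConfigSections"
           type="Tridion.Configuration.ConfigurationSections, Tridion.Common" />
</configSections>
<tridionConfigSections>
  <sections>
    <add filePath="C:\Tridion\config\Tridion.ContentManager.config" />
  </sections>
</tridionConfigSections>
```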

The type of the ConfigurationSection is ‘AppSettingsSection’. This is different from the documentation, but that doesn’t matter.
You can insert whatever section you like, as long as you update the code that gets the ConfigurationSection (cast it to the correct type).
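One way to read such a section is to open Tridion.ContentManager.config directly with System.Configuration. A minimal sketch, assuming an AppSettingsSection named ‘customResolverSettings’ (both the path and the names are mine):

```csharp
using System.Configuration;

// Open Tridion.ContentManager.config explicitly instead of relying on
// the calling executable's own .exe.config. The path is an example.
var map = new ExeConfigurationFileMap
{
    ExeConfigFilename = @"C:\Tridion\config\Tridion.ContentManager.config"
};
var config = ConfigurationManager.OpenMappedExeConfiguration(
    map, ConfigurationUserLevel.None);

// Cast to the section type you declared -- here an AppSettingsSection.
var section = (AppSettingsSection)config.GetSection("customResolverSettings");
string value = section.Settings["someKey"].Value;
```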

While working on an Experience Manager implementation, you often find yourself (at least I do) in the position where you want to change/update the generated HTML to enable nice inline editing in the Experience Manager.
XPM uses HTML comments to ‘mark’ fields as editable. The XPM JavaScript draws a border around such a field to highlight it, so editors know they can edit it.
The JavaScript from XPM uses the nearest HTML container (<div>, <h1>, etc.) to draw this border.

But often enough the HTML doesn’t fit XPM. The border is drawn too big, too small, or doesn’t show up at all because there is no ‘fitting’ HTML element. Or your property doesn’t have a visual representation at all: think of video parameters, or metadata. Ideally you want your editors to be able to edit these properties in the Experience Manager view.

The ideal scenario is that the developer can somehow detect whether the current page/DCP is rendered in the XPM editor view (and NOT just in the ‘normal’ staging website). Knowing that the current page is rendered in the XPM editor view allows the developer to add additional HTML to create a more customized experience for the editor.

There are several solutions to detect the state (in XPM editor view) clientside (using JavaScript), but I haven’t seen a solution for detecting it serverside. And because I needed it in my current project and saw that several people asked for it, I decided to try to build a solution and ‘put it out there’.

This is how it works in a nutshell

XPM loads the staging website in an <iframe>

A GUI extension (1 JavaScript with very few lines of code) is loaded

This JavaScript contains a method that runs just before the page is loaded into the iframe

It updates the <iframe> ‘src’ attribute and adds a querystring parameter called ‘ActiveInXpm=true’

Another line of code takes care of the ‘Exit’ button functionality (it makes sure you are redirected back to the original URL with a querystring parameter ‘ExitXpmEditor=true’)

I know it’s a *very* simple extension, but for me it does the job. In my (DD4T .NET) website I check this querystring parameter and that’s how I know that the current page is loaded in the XPM editor view. (Getting a querystring parameter is a trivial task in every serverside language.)

While this works for the first page that is opened in the XPM Editor, it doesn’t work when the editor navigates to a next page while staying in the Experience Manager. This all happens within the <iframe> and I couldn’t find a reliable way to add the querystring parameter to each request. If you do, please let me know 🙂

I solved this by setting the ‘ActiveInXpm’ querystring parameter value in the session. And my code to check if I am in the Editor view, checks the session. If the user exits the XPM editor by clicking on the ‘Exit’ button, the session is emptied and the page looks exactly how it would look like on the live site. The session handling functionality is only added to the staging website, so the live website *never* has to deal with XPM stuff.
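A minimal sketch of that session handling (the class and method names are mine; the querystring parameters are the ones set by the GUI extension described above):

```csharp
using System.Web;

// Sketch: call this once per request on the STAGING site only,
// e.g. from a base controller or an action filter.
public static class XpmDetector
{
    public static bool IsActiveInXpm(HttpContextBase context)
    {
        if (context.Request.QueryString["ExitXpmEditor"] == "true")
        {
            // 'Exit' button was clicked: clear the flag so the page
            // renders exactly like the live site again.
            context.Session.Remove("ActiveInXpm");
            return false;
        }

        if (context.Request.QueryString["ActiveInXpm"] == "true")
        {
            // First request coming from the XPM editor view: remember it,
            // because subsequent navigation inside the iframe does not
            // carry the querystring parameter.
            context.Session["ActiveInXpm"] = true;
        }

        return context.Session["ActiveInXpm"] != null;
    }
}
```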

I built this for a DD4T .NET site. If you want the code for handling the session, drop me an email and I’ll send it to you.

DD4T stands for Dynamic Delivery For Tridion and is a lightweight ASP.NET MVC framework built on top of the SDL Tridion stack. It’s open source and you can find more about it here

XPM is the WYSIWYG editor (and much more!) that ships with SDL Tridion.

Domain Driven Development is… well, Wikipedia explains it better than I can, so check it out!

I am a big fan of the MVC framework from Microsoft. No wonder I also love the DD4T framework as it makes building MVC websites with SDL Tridion a LOT easier.
One of the shining features of SDL Tridion is its recently upgraded WYSIWYG editor (Or Experience Manager) that allows editors to edit the content of the website in the context of the website itself, in the browser.
This is a great feature and makes it very easy to adjust content in a natural way.

Of course, before content editors can use the Experience Manager (XPM from now on) the SDL Tridion consultant has to pull some triggers to make this possible. All relatively easy to do.

But with DD4T it’s not as straightforward as one would want, especially if you are doing (a form of) Domain Driven Development and thus are using (domain) ViewModels.

Before you read on, I highly recommend that you read Kah Tang’s article on ViewModels in DD4T first. This is how I usually implement DD4T and helps you understand the problem we are trying to solve in this post.

For example, consider the following razor View (which also renders the XPM markup):
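The original code screenshot is lost, so here is a hedged reconstruction of such a View. The field names and the exact ‘SiteEditField’ signature are assumptions.

```razor
@model DD4T.ContentModel.IComponent

@* Hedged reconstruction; field names and helper signature are assumed. *@
<h1>
    @Html.SiteEditField(Model, Model.Fields["heading"])
    @Model.Fields["heading"].Value
</h1>
@if (Model.Fields.ContainsKey("introduction"))
{
    <p>
        @Html.SiteEditField(Model, Model.Fields["introduction"])
        @Model.Fields["introduction"].Value
    </p>
}
```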

As you can see this View uses the DD4T ‘IComponent’ as its ViewModel, and it uses the OOTB DD4T ‘SiteEditField’ HtmlHelper to generate the XPM markup.
While using the IComponent as a ViewModel is valid, it’s not as nice and clean as it could be. The developer has to know the name of the field in Tridion and there’s no compile-time checking, so this approach doesn’t leverage all the advantages of the MVC framework (for example: no strongly typed views).

I always use (domain-)specific ViewModels. Using your own, domain-specific ViewModels has many advantages, of which ‘separation of concerns’ and IntelliSense are just two (IMHO).
A ViewModel’s only purpose is to display the Model in a certain way. Sometimes it is (almost) identical to your Domain Model, but a ViewModel can/must have extra properties to make displaying it possible. It results in much clearer views.

Consider the following example (not rendering the XPM markup):
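Again the original snippet is lost; it would be something along these lines (the ‘Article’ ViewModel and its properties are made up):

```razor
@model MyProject.ViewModels.Article

@* Hedged reconstruction; model and property names are made up. *@
<h1>@Model.Heading</h1>
<p>@Model.Introduction</p>
```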

As you can see it’s much cleaner and easier to read than the previous View and there is also no logic (checking, etc.) involved. (I also could have used an HtmlHelper to write out the tag. It’s up to you.)

But what if your customer asked you to implement XPM? You cannot use the OOTB Html helper since your ViewModel doesn’t have the properties this helper expects.
Well, there are a few options.

1. Add the ‘IComponent’ as a complex property to your ViewModel.
This would look something like this:
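A hedged reconstruction of option 1 (the ‘Article’ class and its properties are made up):

```csharp
// Hedged reconstruction of option 1; names are made up for illustration.
public class Article
{
    public string Heading { get; set; }
    public string Introduction { get; set; }

    // The raw DD4T component, so the View can still reach the Tridion
    // field names needed by the XPM helper.
    public IComponent Component { get; set; }
}
```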

It still requires the developer to know the names in Tridion and it lacks the strongly typed advantages (IntelliSense).

2. Add the XPM markup as separate properties to your ViewModel:
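A hedged reconstruction of this option (property names are made up):

```csharp
// Hedged reconstruction of option 2; property names are made up.
public class Article
{
    public string Heading { get; set; }
    public MvcHtmlString HeadingXpmMarkup { get; set; }

    public string Introduction { get; set; }
    public MvcHtmlString IntroductionXpmMarkup { get; set; }
}
```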

This already looks cleaner and has IntelliSense. But I don’t like the added properties: they clutter the ViewModel.

I struggled with this issue for quite some time. After trying the above-mentioned approaches I wasn’t happy with the result: although they work, they aren’t as nice, clean and intuitive (for a programmer) as they could be.

After spending much time on it I came up with an approach. This approach involved quite some coding, but it’s for a good cause right? And I liked doing it, because I’ve learned a lot of new stuff.

I wanted it to be a generic solution, so everyone using DD4T could use it. This is what it looks like:
(It’s still a work in progress!)

1. Create your ViewModel and decorate it with attributes.
Example:
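The original example is lost; a hedged sketch (the attribute parameter names are assumptions):

```csharp
// Hedged sketch; the attribute parameter names are assumptions.
[InlineEditable]
public class Article
{
    [InlineEditableField(FieldName = "heading")]
    public string Heading { get; set; }

    [InlineEditableField(FieldName = "introduction")]
    public string Introduction { get; set; }
}
```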

As you can see there are two new attributes involved.

InlineEditable

InlineEditableField

The first attribute ‘InlineEditable’ marks the class (ViewModel) as inline-editable with XPM. The second attribute marks a single field as inline-editable with XPM.

Of course this is not everything. Once you have created your ViewModel, you have to call a method to do the magic. Since all my ViewModels are by default created inside a ModelFactory (a subject for a different post), I made this functionality part of a base class, but you can implement it any way you like.
This is what my (simplified) ViewModel builder looks like:
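The original snippet is lost; a hedged sketch of such a simplified builder (the factory shape and the ‘MakeInlineEditable’ signature are mine):

```csharp
// Hedged sketch of a simplified ViewModel builder; names are mine.
public class ArticleModelFactory
{
    public Article Create(IComponent component)
    {
        var article = new Article
        {
            Heading = component.Fields["heading"].Value,
            Introduction = component.Fields["introduction"].Value
        };

        // Attach the Tridion information XPM needs to the model.
        // The exact signature of 'MakeInlineEditable' may differ.
        article.MakeInlineEditable(component);

        return article;
    }
}
```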

That’s all. The article is now ready for XPM. It’s not yet inline editable, but the information from Tridion is added to the class, so we can use it in our View.

Let’s see what our View would look like if we made this ViewModel inline-editable with XPM:
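The original View is lost; a hedged reconstruction (the lambda-style call and the model name are assumptions):

```razor
@model MyProject.ViewModels.Article

@* Hedged reconstruction; the exact helper signature may differ. *@
<h1>@XPM.Editable(m => m.Heading)</h1>
<p>@XPM.Editable(m => m.Introduction)</p>
```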

As you can see we have full intellisense and a nice and clean View. Of course, I simplified the example a little, but it proves a point.

The XPM helper and its ‘Editable’ method write out the actual value of the property and its corresponding XPM markup.
There’s also a ‘MarkUp’ method that just writes out the XPM markup. This comes in handy when you want to make an image or hyperlink inline-editable:
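The original snippet is lost; a hedged sketch of that usage (property names are made up):

```razor
@* Hedged sketch: 'MarkUp' writes only the XPM markup,
   you write the element yourself. Property names are made up. *@
<div>
    @XPM.MarkUp(m => m.Image)
    <img src="@Model.Image.Url" alt="" />
</div>
```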

All this is not yet part of DD4T, but I am planning to integrate it into the framework, as I see it as a valuable addition. (If not: let me know.)
It encourages the use of (domain) ViewModels and results in a cleaner solution.

If you want to use the XPM helper in your project now, drop me an email and I will send you the source code and instructions on how to set it up. In the end there’s really not much to it, but isn’t that the case with all challenges?

In the past week I had the opportunity to install the Experience Manager with Session Preview on a completely DD4T and SDL Tridion driven website. Configuring the Experience Manager can be quite painful, especially if you don’t know how Session Preview (exactly) works and if you have no clue where to start and where to look.

In this post I want to give you some hooks and pointers on where to look if things get interesting 🙂
In fact, if you are DESPERATE about why your Session Preview isn’t working, this post is aimed at you!

But first: thanks to Andrew Marchuk, Daniel and Likhan from SDL Tridion for helping me. Without their help I would still be staring at my screen 🙂

First, turn off caching for your website, just to be sure. After you get Session Preview working, turn caching back on and see what happens. But while troubleshooting Session Preview I recommend turning off caching completely. Just to be sure…

1. Do a basic sanity check and check the following for your staging website:

– Open the cd_storage_conf.xml from your staging website and ensure that:

Again: this module is NOT, I repeat NOT, necessary if your website is completely dynamic (i.e. retrieves everything from the broker, like DD4T). If you still use this module, you will see that clicking ‘Update Preview’ generates files on the filesystem, and it will not show you the updated preview!

Open the cd_ambient_conf.xml file of your Staging website and check if the following Cartridge is referenced:

<Cartridge File="cd_webservice_preview_cartridge.xml"/>

2. Check the following for your OData Webservice: (The one that is used by the Session Preview, so the one you configured as the ‘Content Delivery Endpoint Url’ on your Publication Target)

Copy this ‘Content Delivery Endpoint Url’, paste it into your browser (from inside the company domain, of course) and see if it responds.

The url looks like this: http://localhost:73/odata.svc/
You should get a response with a listing of all collections that can be retrieved by this OData endpoint. Something along the lines of this:
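For reference, an OData service document is an AtomPub listing roughly like this. The collection names shown here are illustrative; your endpoint lists its own.

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Illustrative service document; collection names will differ. -->
<service xml:base="http://localhost:73/odata.svc/"
         xmlns="http://www.w3.org/2007/app"
         xmlns:atom="http://www.w3.org/2005/Atom">
  <workspace>
    <atom:title>Default</atom:title>
    <collection href="Pages">
      <atom:title>Pages</atom:title>
    </collection>
    <collection href="ComponentPresentations">
      <atom:title>ComponentPresentations</atom:title>
    </collection>
  </workspace>
</service>
```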

Now, do an IISReset on your website and for your OData webservice. This makes sure that your changes to the various configuration files are used when the Content Delivery Instance boots up the first time. DO NOT SKIP THIS STEP! In case you are still in doubt: DO NOT SKIP THIS STEP! (Sorry for shouting)

Now, hit ‘Update Preview’ again. If it is still not working for whatever reason keep reading:

1. Open the logback.xml file of your Staging website, and set the loglevel to ‘VERBOSE’.
2. Open the logback.xml file of your OData webservice and set the loglevel to ‘VERBOSE’.
3. Clear both logfiles! (So you have a fresh start)
4. Clear the ‘Tridion’, ‘Tridion Content Manager’ and ‘Application’ Windows Eventlogs on the Content Manager Server
5. Clear the ‘Application’ Windows Eventlog on the Staging WebSite server
6. Clear the ‘Application’ Windows Eventlog on the Odata webservice server

Do an IISReset (You edited the logback.xml file, so this is necessary!)

Now, hit ‘Update Preview’ again and check out the logfiles in this order:

cd_core.log of your Staging website
-> Anything unusual? Especially errors and warnings with regard to the Ambient Data Framework are important! Take them seriously and double-check the cd_ambient_conf.xml and the cd_storage_conf.xml of your staging website. Also check if all HttpModules and/or filters are present in the Web.config of your website! (See above)

cd_core.log of your OData website. If this file is (almost) empty that means that the ‘Update Preview’ request NEVER reached the OData webservice. This could be due to:
– Network issues: are the IIS Bindings of the OData webservice correct?
– Can you connect to the OData webservice using your browser?
– Is your publication target pointing to the correct Content Delivery Endpoint Url (your OData webservice)?

If there is data in the cd_core.log of your OData webservice, check to see if there are errors or unusual statements.

If you search for your adjusted content, do you see it? If so, your changed content was correctly sent to the OData webservice. If not, your staging website cannot connect to the OData webservice. Again: check IIS settings and network settings.

Open the Session Preview Database using SQL Server Management Studio, and open the table ‘Component Presentations’. After you hit ‘Update Preview’, you SHOULD see something added to this table. If not: check if you referenced the correct Session Preview Broker Database in BOTH of your Wrappers. (In the cd_storage_conf.xml of your Website and in the cd_storage_conf.xml of your OData webservice!)

If you see an HTTP error 400 BAD REQUEST after you click on ‘Update Preview’ check the following:

Open the Windows EventLog ‘Tridion Content Manager’ on the Content Manager server and check if you see the same error here.

If so, try the following:

Stop the TcmServiceHost Windows service on the Content Manager server. (Be careful: the SDL Tridion Content Manager stops working now!)
Next, browse to the SDL Tridion install directory\bin with the command prompt and start TcmServiceHost.exe with the -debug flag. Like this:

TcmServiceHost.exe -debug

Now, open Fiddler on the Content Manager server, apply a filter to show only traffic from the TcmServiceHost and hit ‘Update Preview’ again. Now you have the request and you can inspect it to see if there’s anything unusual, e.g. a Content-Length of 0. That’s weird, because it means no data was sent!

The last resort consists of tracing everything related to the OData webservice. If everything above failed, do the following:

Open the Web.config of the OData webservice and add the following code:
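The original snippet is lost; a hedged sketch of what WCF tracing plus enlarged message limits can look like in the Web.config. The log path, binding name and sizes are examples.

```xml
<!-- Hedged sketch: WCF tracing plus enlarged buffer sizes.
     Adjust names and paths to your service; remove after troubleshooting. -->
<system.diagnostics>
  <sources>
    <source name="System.ServiceModel"
            switchValue="Information, ActivityTracing"
            propagateActivity="true">
      <listeners>
        <add name="traceListener"
             type="System.Diagnostics.XmlWriterTraceListener"
             initializeData="C:\logs\odata_trace.svclog" />
      </listeners>
    </source>
  </sources>
</system.diagnostics>

<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <!-- maxReceivedMessageSize and maxBufferSize must be equal -->
      <binding maxReceivedMessageSize="2147483647"
               maxBufferSize="2147483647" />
    </basicHttpBinding>
  </bindings>
</system.serviceModel>
```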

Note that ‘maxReceivedMessageSize’ and ‘maxBufferSize’ should be the same!

Don’t forget to remove these settings once you’ve resolved all your issues!

Phew! I *really* hope your Session Preview service is now working properly.
If this isn’t the case consider asking it on StackOverflow (and while you’re there, consider committing to the SDL Tridion Exchange Proposal)
The community is really helpful and very knowledgeable. Of course you can also open a support ticket with Customer Support.

In a website built with DD4T, (almost) all content comes from the Broker Database. The content is stored as an XML string in the database and is transformed (de-serialized) into .NET objects at request time. As you can imagine this has a huge impact on the performance of your website: on every request the XML is loaded (streamed) from the database and the DD4T framework de-serializes it into usable .NET objects.
This is a time-consuming process and puts a heavy load on your webserver.

Luckily there are a few options/strategies to improve the performance of your website. And the beauty of these options is that they (almost) come for free!

Output Caching

The first option is the out-of-the-box Output caching from ASP.NET.
Just decorate your controller (PageController, ComponentController; your choice) with the OutputCache attribute and you’re done!
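For example (a sketch; the controller shape depends on your DD4T version, and the cache profile name is made up):

```csharp
public class PageController : Controller
{
    // Cache duration and settings come from the 'PageCache' profile
    // configured in web.config, e.g.:
    //
    //   <caching>
    //     <outputCacheSettings>
    //       <outputCacheProfiles>
    //         <add name="PageCache" duration="300" varyByParam="*" />
    //       </outputCacheProfiles>
    //     </outputCacheSettings>
    //   </caching>
    [OutputCache(CacheProfile = "PageCache")]
    public ActionResult Page(string url)
    {
        // ... resolve the DD4T page for this url and return the View ...
        return View();
    }
}
```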

OutputCache caches the output (…) for the duration you configured in the web.config. If the cache duration is set to 5 minutes, and within those 5 minutes you publish a page, the changes are NOT reflected in your browser when you hit F5. Only after 5 minutes is the cache invalidated, and on the next request the XML is loaded from the Broker Database. And de-serialized.

DD4T Caching

Luckily for us, DD4T ships with a built-in caching mechanism. It is built on top of the .NET System Runtime Cache and can be used in conjunction with the Output Cache.

DD4T caching stores de-serialized objects like Components and Pages in the .NET Runtime cache after they are requested for the first time. Every consecutive request for that page/component loads it from the object cache instead of loading it from the Broker database and de-serializing it into .NET objects.
As you can imagine, this causes a massive performance improvement.

But how does DD4T ‘know’ when to invalidate an item in the cache? Because if you re-publish a page or component, you want your website to show the updated version.
The fact is that DD4T never knows when an item is republished unless it ‘asks’ SDL Tridion (since a website is stateless).
Well, this ‘asking’ is implemented in DD4T.

DD4T polls every x seconds/minutes/hours (configurable) to see if the LastPublishDate of an item in the cache has changed. If it has changed (the item was republished), DD4T invalidates that item. The next time the item is requested, it is loaded from the Broker database, de-serialized and stored in the cache.

To configure how often DD4T needs to check the LastPublishDate of the items in the cache, use this setting in your web.config (value must be in seconds)

<add key="CacheSettings_CallBackInterval" value="30"/>

In this example, DD4T polls the Broker Database every 30 seconds to check if the items in the cache are still valid.

Also, after a configurable amount of time, the item is invalidated no matter what. This amount of time can be configured separately for pages and components.
Use the following configuration settings to accomplish this:
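I don’t have the exact key names at hand and they differ per DD4T version, so treat these as placeholders and check the DD4T source or documentation for your release (values are in seconds):

```xml
<!-- Hypothetical key names; verify against your DD4T version. -->
<add key="CacheSettings_AbsoluteExpirationPages" value="3600" />
<add key="CacheSettings_AbsoluteExpirationComponents" value="3600" />
```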

In this example all the pages and components in the cache are invalidated after 1 hour.

SDL Tridion Object cache

SDL Tridion comes with a caching solution called ‘Object cache’. To quote the documentation, this is what it does:

To improve the performance of your Web site, you can choose to store the most commonly used or resource-intensive objects from your Content Data Store in a cache. The cache keeps these objects available for the applications that request them, rather than reinitializing them each time they are requested.

Pretty obvious right? So no need to explain it further.
Read the documentation here. (Login required)

A while ago I was struggling with exactly this challenge: I wanted to load some JavaScript into the Tridion Content Manager GUI without showing a corresponding GUI element (button, list, etc.).
I searched the online documentation portal, the good old forum, searched all the Tridion blogs, but could not find it.
With no other option left, I turned to the experts. Since recently, they can also be found here.
(And while you’re there, why not join us?)

It took precisely 3 minutes and I had my answer. Since I could not find it, I assume you also cannot find it. That’s why I share it here.
But not without mentioning the one who gave me the answer: Frank. Thanks.

For adding a JavaScript to extend the Tridion Content Manager GUI, but without showing a button or list or whatever, the following configuration is needed:
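A hedged sketch of such an Editor.config addition. The exact group and attribute names may differ per Tridion version; the essential part is a fileset with your script and no GUI element definition.

```xml
<!-- Sketch only; verify names against your Tridion version. -->
<cfg:group name="MyCompany.MyExtension" merge="always">
  <cfg:fileset>
    <cfg:file type="script">/MyExtension/Scripts/MyExtension.js</cfg:file>
  </cfg:fileset>
</cfg:group>
```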

This piece of configuration makes sure that your JavaScript file is loaded for the complete GUI. This might not be what you want.
Let’s say you only want to load your methods/classes for a certain view, say the Component edit screen. You can achieve this by adding the following code to your
JavaScript file:
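A hedged sketch of such a check. The Anguilla calls and the view name used here are assumptions; verify them against your Tridion version.

```javascript
// Sketch: only run our code when the Component edit view is shown.
// $display is Anguilla's object describing the current view.
(function () {
    if (typeof $display !== "undefined" &&
        $display.getView &&
        $display.getView() &&
        $display.getView().getId() === "ComponentView") {
        // We are on the Component edit screen: hook up the extension here.
    }
}());
```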