UX Insider – 5 Tech Tips for Session Recording

Customer expectations for digital experiences are higher than ever. Imagine you are charged with reducing the number of technical support cases and user errors on your company’s online business. Where do you even begin? Given the complexity of any transactional website, the logical starting point is to monitor the user experience. The ability to capture the entire session of every visitor to the website is an essential component of digital experience management (DEM).

Weighing Your Options

While there are several tools on the market that provide this functionality, I’d like to share five technical tips and considerations for anyone looking to augment their DEM strategy with user session recording.

1. 360-Degree User Experience Capture

The biggest requirement by far is the ability to capture 100% of online user experiences across devices, browsers, single-page apps and native mobile apps — all digital channels. Missing even ONE user session exposes your department or organization to risk.

Modern websites make heavy use of modern browser technology. Make sure the capture tool supports these technologies and has a short update cycle that keeps pace with your business. For native apps, make sure session recording is instrumented into the app itself.
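To make the idea of browser-side instrumentation concrete, here is a minimal sketch of what a session recorder’s core could look like. The event shape, class name and `flush` batching are all illustrative assumptions, not any particular vendor’s API:

```typescript
// Hypothetical minimal session recorder: buffers timestamped UI events
// for later upload. Event shape and names are illustrative only.
interface RecordedEvent {
  type: string;      // e.g. "click", "input", "navigation"
  target: string;    // CSS selector or screen name
  timestamp: number; // ms since session start
}

class SessionRecorder {
  private events: RecordedEvent[] = [];
  private readonly start = Date.now();

  record(type: string, target: string): void {
    this.events.push({ type, target, timestamp: Date.now() - this.start });
  }

  // Hand off the buffered batch (e.g. to a beacon endpoint) and reset.
  flush(): RecordedEvent[] {
    const batch = this.events;
    this.events = [];
    return batch;
  }
}

// In a real browser you would wire this to actual DOM events, e.g.:
// document.addEventListener("click", e => recorder.record("click", cssPath(e.target)));
const recorder = new SessionRecorder();
recorder.record("click", "#checkout-button");
recorder.record("input", "#search-box");
console.log(recorder.flush().length); // 2
```

A real tool does far more (DOM snapshots, network correlation, mobile SDKs), but the principle is the same: every channel feeds timestamped events into one session stream.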

2. Ease of Use

While all the set-up, configuration and indexing may happen in the background, on the server side or in the cloud, search, retrieval and playback must be easy for every department, especially for less technical users. Ultimately, this allows users to quickly find the exact moment when an issue occurred.

See what your user has seen: fast-forward to the relevant position, correlate to events, and capture and reproduce the problem for your tech or UX department so they can fix it.
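The “fast-forward to the relevant position” step boils down to finding the playback offset of the event you care about. A hypothetical helper might look like this — the event shape and error-type names are assumptions for illustration:

```typescript
// Illustrative helper: given a recorded session, return the playback
// offset (ms) of the first error event so an analyst can jump straight
// to it instead of watching the whole replay. Event shape is hypothetical.
interface SessionEvent {
  type: string;      // "click", "jsError", "httpError", ...
  timestamp: number; // ms since session start
}

function firstErrorOffset(events: SessionEvent[]): number | null {
  const err = events.find(e => e.type === "jsError" || e.type === "httpError");
  return err ? err.timestamp : null;
}

const session: SessionEvent[] = [
  { type: "click", timestamp: 1200 },
  { type: "httpError", timestamp: 5400 },
  { type: "click", timestamp: 6000 },
];
console.log(firstErrorOffset(session)); // 5400
```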

3. Ease of Deployment and Update

Most forward-looking IT organizations are invested in digital transformation and modernization, ensuring the tools they purchase today will be compatible with any over-arching architectural strategy. The tool you select for session recording and replay should be easy to install — and I’m talking minutes, not days — as well as easy to deploy across your application landscape. The tool must also adapt to new features and changes in the environment automatically, with minimal configuration changes. This is especially important for very large cloud-based online businesses, where manually configuring thousands of servers and services is impossible.

4. Lightweight

If you are concerned about UX, you must also be concerned about your application’s page-load times. It makes no sense for the tool you select to monitor them to add seconds to load times through its recording. The tool must not visibly impact your application’s performance or processing speed. The same goes for collected data volume: storage is not a cheap commodity, so why maintain and pay for large volumes of data? Make sure the tool of your choice keeps data volumes to a minimum and retains only the data that is relevant to you.
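One common way tools keep data volumes down is to thin out high-frequency, low-value events (such as raw mouse movements) before upload. The following sketch is a simplified illustration of that idea, not any specific product’s behavior:

```typescript
// Illustrative data-volume reduction: drop pointer-move events that
// arrive closer together than minGapMs, while keeping every meaningful
// event (clicks, inputs, errors) untouched.
interface UiEvent {
  type: string;
  timestamp: number; // ms since session start
}

function thinEvents(events: UiEvent[], minGapMs: number): UiEvent[] {
  let lastMoveTs = -Infinity;
  return events.filter(e => {
    if (e.type !== "mousemove") return true; // always keep meaningful events
    if (e.timestamp - lastMoveTs < minGapMs) return false; // too soon: drop
    lastMoveTs = e.timestamp;
    return true;
  });
}

const stream: UiEvent[] = [
  { type: "mousemove", timestamp: 0 },
  { type: "mousemove", timestamp: 20 },
  { type: "mousemove", timestamp: 40 },
  { type: "click", timestamp: 45 },
  { type: "mousemove", timestamp: 120 },
];
console.log(thinEvents(stream, 100).length); // 3
```

Production tools also compress and deduplicate DOM snapshots, but the principle is the same: record what matters, discard the noise at the source.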

5. Secure

Whatever your business, when dealing with user data such as credit cards, addresses or even something as basic as a username and password, you need to ensure that this information is ALWAYS secure at every step of the process. For session replay this means sensitive data must be protected at capture, indexing, storage and replay. Ideally, the tool can exclude sensitive data from recording altogether. Ensure the tool can obscure sensitive information as needed, and that encryption is applied throughout. To comply with regulations, any personal data must be identifiable and purgeable at a customer’s request. Make sure the tool of your choice adheres to these requirements.
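Excluding sensitive data at capture time often amounts to a masking step applied before anything leaves the browser. Here is a deliberately simple sketch; the field-name pattern is an assumption and real tools use configurable rules rather than one hard-coded regex:

```typescript
// Illustrative capture-time masking: values of fields whose names look
// sensitive are replaced before recording, so the real value is never
// captured, stored or replayed. The name pattern here is an assumption.
const SENSITIVE = /passw|card|cvv|ssn|secret/i;

function maskField(name: string, value: string): string {
  return SENSITIVE.test(name) ? "*".repeat(value.length) : value;
}

console.log(maskField("cardNumber", "4111111111111111")); // "****************"
console.log(maskField("searchQuery", "red shoes"));       // "red shoes"
```

The key design point is where the masking runs: applying it on the client, at capture, means the sensitive value never reaches the indexing or storage layers at all.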

Based on your organization’s architectural roadmap (e.g. moving to the cloud, virtualization), it would be easy to add more points to this list. If you have any particular questions, thoughts or considerations of your own that you would add here, please join the conversation by commenting below.

Peter has a PhD in theoretical physics and has worked on large-scale data processing projects his entire career, including the Sloan Digital Sky Survey and the CERN Large Hadron Collider, and later many projects in life and health sciences.
Peter always tells anyone who is willing to listen that whoever has the best data gets the most interesting insights from it.
As dev lead at the Dynatrace Barcelona lab, he has a lot of fun building the session replay, usability and user experience analytics components, making use of the really cool data that is available through Dynatrace.