Missing persons

April 6, 2018

The age of radical digital personalisation is here, and it’s changing the way we consume. But don’t bet your house on it.

In days gone by, the best you could hope for in the way of content personalisation were productions targeted at your “demographic”. But out of the hundreds of possibilities explicitly designed for people who tick your demographic boxes, you’d have been hard-pressed to find more than a few movies, TV shows or articles that you truly enjoyed. You might even have categorically hated pretty much everything that purported to be for “people like you”—whatever that meant.

Today, technology has flipped the paradigm.

Instead of matching individual pieces of content to swathes of people, the algorithms that drive our digital experiences match individual people to swathes of content.

And the more of our data they have, the more effective they are. Everything from your basic personal details to your Likes and even your writing style can be used to predict the content you’ll appreciate to a level of accuracy your own parent, spouse or child could never hope to match. Log onto a social network, download a news aggregator or sign up to a streaming service and you can be sure that, shortly thereafter, a menu of content you really are likely to enjoy will simply be presented to you.

What do we really want?

So far, so good. But we’re complex creatures, with desires that are often in tension or even outright conflict with one another. It’s easy to unlock our smartphones and ride those algorithms on an effortless voyage through the digiverse that feels enjoyable from moment to moment. But after an hour, or two, or three, we look up from our screens and realise what we’ve been doing and for how long we’ve been doing it. And we reflect. At a deep level, is this really how we would have chosen to spend that time had we stepped back at any point to consider it?

Algorithms make it easy for us to meet some of our goals. But certainly not all of them. And arguably not the most valuable ones. Echo chambers, growing polarisation, dumbing down, the demise of nuance, the rise of outrage, the proliferation of fake news—all of these are, at least in part, unintended and undesirable consequences of an over-reliance on algorithms.

These lines of code are well suited to satisfying our shallowest desires. But when it comes to our deepest values, they are entirely unresponsive.

Imagine an alternate universe in which, whenever we unlock our smartphone in search of content, a Personal Content Advisor (or PCA) appears on screen and talks to us about what we’d like to see. Based on that conversation, our PCA then presents us with a personalised menu of content. In one respect, this would be torturous. Where an algorithm can get to know our tastes extremely quickly and automatically, our PCA would need us to spend a long time in conversation before they could get anywhere near the same level of accuracy. But through such exchanges, our PCA would be able to uncover and respond to far deeper values than an algorithm ever could. An algorithm might analyse our personal and behavioural data and see that we enjoy reading listicles. But it would take reflective conversation to reveal that we don’t like that we like listicles. In our most sober moments, we may recognise that although we find it comforting to skip through a neat list of five delicious things to do with cheese, greater riches lie in more thoughtful, demanding reads. While it’s reassuring to only read articles from people who agree with us, we know that we can’t learn unless we expose what we believe to challenge from new ideas and alternate viewpoints. And juicy as it is to read about who’s dating whom, we know that we need to engage with news of greater consequence if we want to be better citizens. In light of such insights, we might then ask our PCA to save us from ourselves. The result—a menu of creations that contains fewer pacifiers and more opportunities for personal growth.


Handle with care

There is much talk of using psychometric, data-driven algorithms in the financial sector. Such algorithms could be used to understand our appetite for risk, the sorts of investment that excite us and the kinds of product we gravitate towards, in order to make so-called personalised recommendations and design tailored investment strategies for individual clients. Such a model would almost certainly increase sales, as well as giving clients the feeling that their bank, rather than just their financial advisor, knows them better than ever.

But when it comes to designing long-term investment strategies, it’s crucial that they’re geared towards meeting our most enduring and considered goals rather than just our ephemeral desires.

As their application in the field of digital content has shown, to rely on algorithms in finance is to risk mistaking consumption for satisfaction.

Nevertheless, data-driven algorithms have the potential to change financial services for the better. No other tool can tell us so much about a person so quickly, and those insights could vastly improve the overall quality of financial advice. But it’s crucial that we maintain the human touch.

Suppose our alternate universe contained algorithms alongside PCAs. In this universe, our PCAs could use data-driven insights to inform, but not define, their recommendations. They could pay close attention to these insights while always recognising that we are more than just the sum of our data. Our recommendations would then account for our tastes far more quickly and accurately than a PCA could manage alone, while continuing to strike a balance between content that entertains us and creations that add real value to our lives.

The value of humanity

Fortunately, we don’t need to travel to another universe to access a financial advisor. And it’s important it stays that way. As algorithms become ever more sophisticated, some banks may be tempted by cost savings and short-term profits to replace financial advisors altogether. In this scenario, an algorithm would review all of the available options and curate the few it thinks the client is most likely to choose based on their psychometric profile. But there’s a potential further step. Banks might even claim that the technology is so sophisticated that the client need make no active decisions at all—apart from one: the decision to allow algorithms to adopt their psychometric profile and invest on their behalf, as if the algorithm were a combination of client and financial advisor.

They do so at their peril.

Algorithms will add enormous value, but only if their insights are mediated by a person who is engaged with their clients in all of their complex individuality.

Otherwise, the result will be a step backwards—shallow financial advice that might increase sales for banks in the short term, but which fails to honour their clients’ deepest values and needs or effectively mitigate risk. For instance, an algorithm might correctly see that a client gravitates towards risky investments that provide high returns in a short period of time. But it takes a fellow human to surface goals that, by virtue of their depth, are less salient—preparing for a distant retirement, perhaps, or building a nest egg for a child not yet born.

At Lombard Odier, we value long-term relationships over short-term profits. That philosophy has been at the heart of our business since our founding and has allowed us to grow with our clients for over 200 years. So while we will always look to the latest technology to improve the advice we offer our clients, we will not rely on it at the expense of human connection. For it is this connection that allows us to work with our clients to design investment strategies that align with their deepest values. And that is our deepest value.

Important information

This document is issued by Bank Lombard Odier & Co Ltd or an entity of the Group (hereinafter “Lombard Odier”). It is not intended for distribution, publication, or use in any jurisdiction where such distribution, publication, or use would be unlawful, nor is it aimed at any person or entity to whom it would be unlawful to address such a document. This document was not prepared by the Financial Research Department of Lombard Odier.