I’ve shared office space with opera singers, sculptors, concert pianists, jewellery designers and a man fluent in seven languages, including Mandarin. None of the jobs I was in at the time was ‘creative’, and these specific skills may have been hard – languages aside – to call upon in company service, but recognising and supporting ‘creativity’ in the workplace does seem to be a neglected issue. I was reminded of how much I dislike the work-oriented tendency to define ‘creativity’ as ‘problem-solving’ (with a heavy undertone of ‘cost saving’) a few weeks back by a post at one of our favourite blogs: HR Bartender.

Calling on a dictionary for moral support, Sharlyn Lauby reminded her readers that innovation is the introduction of a new idea, method etc., while creativity is the ability to produce through imaginative skills. Innovation may sometimes be brave, but in chronological terms innovation is the egg to creativity’s chicken. While the sceptics can offer their own ‘curate’ jokes at this point, let’s be clear: eggs don’t lay themselves. And in this particular metaphorical farmyard, you hire the chickens. (Whether you find “Q: Who came first? – A: The recruitment consultant” funny or not is your own affair, ok?)

In an earlier post, Tortoise-brain vs hare-brain: creativity at work, we discussed creativity and the difficulty some organisations have not just in managing it but in allowing it to deliver its maximum potential. As we said at the time,

“The commercial urge for speed contrasts with the creative need for analysis, thought, review and contemplation in the same way as the tortoise and the hare. But despite the common myths of the creative genius, they are not necessarily the ones being ‘hare-brained’.”

At least one observer thinks that the recent financial environment has had a further negative impact. Interviewed in the Globe and Mail in February this year, Linda Naiman, a creativity specialist and founder of Creativity at Work in Vancouver, commented that:

“Managers are too busy putting out fires. They don’t have time to think about creativity. They’re in survival mode and may be afraid of the boss, afraid of failure, or afraid of taking risks. They become control freaks and wind up telling staff what to do and how to do it, and that leaves employees frustrated.”

From my experience working in web development, I can think of many talented creatives who have moved on from roles where they were hired for their ideas and flair, and then lambasted for attempting to deploy them (even where these were individuals with a keen business sense who understood that solutions had to be acceptable to customers). And I can think of other friends who have left education (at school and higher education levels) in frustration at the tightening curriculum straitjacket and the tendency for workplace conversation to be about budgets, performance tables and financial targets rather than educating people – the very reason they applied for the job in the first place.

Quite apart from the impact on an organisation’s ability to achieve high-quality creativity and innovation, a ‘no elbow room’ approach also damages employee engagement – and thereby many aspects of current performance, as well as the ability to remain competitive through innovation in the short- and mid-term future.

It’s a theme picked up by another observer in the UK this week. Aditya Chakrabortty wrote an article – Why our jobs are getting worse – in The Guardian’s Brainfood column that drew similar conclusions:

More and more prized careers are becoming McDonaldised – more routine, less skilled, and with the workers subject to greater control from above. Take supermarkets. Jobs there could traditionally be split between the unskilled, low-paid drudgery of stacking shelves and sitting on tills – and the trained butchers and fishmongers and store managers. But when the sociologist Irena Grugulis and a team of researchers recently studied two of Britain’s largest supermarket chains, even the managers reported that they had little room for manoeuvre.

A trained butcher revealed that most meats were now sliced and packaged before they arrived in store; bakers in smaller shops now just reheated frozen loaves. […] Grugulis and her colleagues note that “almost every aspect of work for every kind of employee, from shopfloor worker . . . to the general store manager, was set out, standardised and occasionally scripted by the experts at head office”. Or, as one senior manager put it: “Every little thing is monitored so there is no place to hide.”

He also makes reference to some of the work of Prof Phil Brown at the University of Warwick’s Institute for Employment Research, and in particular his concept of ‘Digital Taylorism’ – the translation of individual knowledge into defined processes, the ‘head office scripts’ that Irena Grugulis talked about. Here’s an extract from Education, globalisation and the knowledge economy (downloadable as a PDF), written for and published by the Economic and Social Research Council’s Teaching & Learning Research Programme.

“This part of our analysis suggests that if the twentieth century brought mechanical Taylorism, characterised by the Fordist production line, where the knowledge of craft workers was captured, codified and re-engineered in the shape of the moving assembly line by management, the twenty-first century is the age of digital Taylorism.

This involves translating knowledge work into working knowledge through the extraction, codification and digitalisation of knowledge into software prescripts and packages that can be transmitted and manipulated by others regardless of location.”

Fordism, of course, sprang from Henry Ford, US car mogul. If you’ve read our earlier post A tale of one city: Leadership and legacies and are aware of the fate of Fordlandia, you’d be forgiven for wondering whether many organisations may be heading for a similar fate. You’d also be forgiven for asking our favourite question as to how we’ve found ourselves facing this potential dilemma: why?

One hunch might take us back to another earlier debate: HR’s role at the top table. We rhetorically asked what happens if HR functions’ voice at that table is that of the Finance Director, and the writers above provide a possible answer. Brown and his colleagues acknowledge that Digital Taylorism doesn’t eliminate the need for a level of employee motivation – not least to provide a cheerful face for customer service and to minimise the costs of employee turnover. But a possible future that is highly skilled, lowly waged and offers increasingly little room for the employee voice – let alone the employee contribution – doesn’t seem hugely optimistic.

We once cheekily asked if HR shouldn’t make themselves redundant, devolving their role to the line managers whose direct impact is greater. Might we not also ask the same rhetorical question of IT departments – who invoke the Shirky Principle to argue for the implementation of workflow and monitoring systems, securing technology’s place at the top table on the grounds that things must be measured before they can be managed (and supplying a line of new programmes to measure ever more aspects of the workplace, whether or not the value of the particular metrics has yet been proven)? And of Finance Directors. Being churlish for a moment: unless a company sells financial services and their time is charged out to customers, FDs represent a pure cost to the business. Furthermore, they often reduce the productivity of staff by demanding their input on financial monitoring and record-keeping.

As we concluded before, the HR role is surely to argue for – and demonstrate – the critical importance of human capital. The human capital required to be creative, to exercise a little elbow room to move beyond the script where doing so recognises the customer’s requirements and closes the sale. To change the script where its impact has obviously hindered performance and morale. To recognise where a little latitude can help, and to acknowledge that processes and scripts don’t deliver performance – people do.

HR’s input, by contrast, can work to boost the productivity and output of all staff, raising company profit all round. Apart from time spent on training – which, let’s face it, is usually far less than that spent on timesheet-filling, expense-claiming or supporting financial audits – all their input should be positive.