Take taxi drivers. The prevailing wisdom is they will be replaced by Uber drivers, who in turn will ultimately be replaced by self-driving cars. Those lauding Transport for London’s refusal to renew Uber’s licence might like to consider how, long before that company “disrupted” the industry, turn-by-turn GPS route management and dispatch control systems were de-skilling taxi drivers: instead of building up navigational knowledge, they increasingly rely on satnavs.


Fears about humans becoming like machines go back longer than you might think. The sort of algorithmic management we see in the modern gig economy – in which drivers and riders for digital platforms such as Uber and Deliveroo are dispatched and managed not by human beings, but by sophisticated computer systems – has its roots in a management theory developed by Frederick Taylor in the early 20th century. As a young man, Taylor worked as a shop foreman for a steel-making corporation in Philadelphia, where he diagnosed inefficiencies he saw as being products of poorly structured incentives, unmotivated and sometimes shirking workers, and a huge knowledge gap that rendered management ineffective. Managers, he proclaimed, knew too little about the workforce, their tasks, capabilities and motivations.

Taylor and his disciples extolled the virtues of breaking down tasks into inputs, outputs, processes and procedures that can be mathematically analysed and transformed into recipes for efficient production. Over decades, and across different industries, his theories have been used to apply time-and-motion studies to workplaces, workers and what they produce. The assembly line is the most recognised example of Taylorism: unskilled workers engage in repetitive, mindless tasks, attending to semi-finished parts that, in the end, are combined into a whole product.

Over time, Taylorism became synonymous with the evils of extracting maximum value from workers while treating them as programmable cogs in machines. An early case in point: in 1917, at the height of wartime, approximately 100,000 Australian workers took part in a general strike. The action was sparked by the introduction of time cards, which recorded every minute spent at jobs and on breaks. Today, it’s hard to think of time cards, even digital ones, as an innovation. They have faded into the background of office life, business as usual for many workers. But back then they were seen as a new tool of oppression: managers could use the information to learn how fast everyone worked and demand a quicker pace. This demeaning model was decried as “robotism”.

‘Instead of building up navigational knowledge, they increasingly rely on satnavs – but it’s not just taxi drivers being de-skilled.’ Photograph: Richard Gardner/Rex Features

Taylor’s approach jump-started debates about data-driven innovation and surveillance that continue today. The modern, digital version of Taylorism is more powerful than he could have ever imagined, and more dehumanising than his early critics could have predicted. Technological innovations have made it increasingly easy for managers to quickly and cheaply collect, process, evaluate and act upon massive amounts of information. In our age of big data, Taylorism has spread far beyond the factory floor. The algorithmic management of the gig economy is like time cards on steroids.

And it’s not just taxi drivers who are being de-skilled. The logistics and trucking industries use even more extensive and intensive data-driven systems to control fleets and employees. Employers deploy an array of sensors to track location, timing, driving style and other aspects of performance. Complex algorithms, analytics software and other hidden components of management systems generate intelligence that is then used to instruct truck drivers. Cornell University professor Karen Levy has documented how these intense management systems reduce workers’ autonomy and can incentivise sleep deprivation and speeding.

Technology also allows much more sophisticated performance management of employees than was possible in Taylor’s lifetime. Back then, employee reviews were resource-intensive: they required face-to-face meetings or documents that took time to pull together. Today small businesses as well as giants such as Amazon are using digital tools to create continuous streams of data for employee appraisal. Constant monitoring, and the addition of peer review to supervisor feedback, can create overly competitive, and sometimes hostile, dynamics between employees.

It’s not just the intensity of the monitoring that is different. Surveillance is increasingly hidden. In Taylor’s analogue era, workers were acutely aware when they were being observed by management with stopwatches and notebooks. Today management tools are much less visible. A cashier at a fast-food franchise who rings up purchases with a virtual cash register app on her tablet might be unaware of the programs running surreptitiously in the background, logging keystrokes, recording audio or video, transmitting data and continuously rating performance. Workers who might know that their boss monitors calls, texts, and browsing on their employer-issued smartphones might be surprised to learn that the device also communicates geolocation data, allowing tracking of their movement 24/7.


The first line of defence against digital Taylorism is to resist its relentless creep within and outside the workplace. Taylor’s logic has become embedded in our everyday lives through our always-on digital environment. There is no easy solution to this. To find remedies, we’ll need to experiment with regulation and with strengthening workers’ rights through institutions – such as unions – that have been weakened. France’s effort to promote the “right to disconnect” – which requires workplaces with 50 or more employees to negotiate limits on work emails outside office hours – might not be perfect, but it is a step in the right direction.

We also need a shift in perspective. Taylorism starts from the assumption that employees are innate shirkers. While there will always be some who want to game the system and put in as little effort as possible, there are plenty who don’t. When the guiding assumption of management is that employees won’t be productive unless forced to be by constant observation, it engineers low morale and pushes people to act like resources that need to be micromanaged. Too often, we become what we’re expected to be.

On paper, making human beings behave like simple machines might deliver greater efficiency. But modern-day Taylorism threatens something that those kinds of market analyses fail to capture: the value of being human.

• Brett Frischmann is a professor in law, business and economics at Villanova University, Pennsylvania; Evan Selinger is a philosophy professor at Rochester Institute of Technology, New York state