Monday, November 06, 2017

63% of children entering primary school today will be doing jobs that don’t yet exist & 47% of jobs will be automated... oh yeah...10 reasons why this is bullshit

63% of children entering primary school today
will be doing jobs that don’t yet exist. I’m tired of seeing this at conferences... I read
and hear it often. The last was from two speakers on the same platform, an
'Internationally Recognized Expert on the Future of Work and the Future of
Learning' and a senior bod at LinkedIn, at OEB in Berlin. Let's be clear, this
quote is made up - there is no source - it's the touchstone of the charlatan.

47% of jobs will be automated... I’ve lost count of the times I’ve seen this mentioned in
newspapers, articles and conference slides. It is from a 2013 paper by Frey and
Osborne. First, it refers only to the US, and only states that such jobs
are under threat. Dig a little deeper and you find that it is a rather
speculative piece of work. AI is an ‘idiot savant’, very smart on specific
tasks but very stupid and prone to massive error when it goes beyond its narrow
domain. This paper errs on the idiot side.

They looked at 702 job types and then, interestingly, used AI itself (machine learning). They started with 70 jobs, judged by humans as being at risk of automation or not, and trained a ‘classifier’, a software program, on this data to predict the probability of each of the other 632 jobs being automated. You can already see the weaknesses. First, the human-labelled training set: get this wrong and the error sweeps through the much larger set of AI-generated conclusions. Second, the classifier: even if it is out by only a little, it can reach wildly wrong conclusions. The study itself, largely automated by AI, rather than being a credible forecast, is more useful as a study of what can go wrong in AI. Many other similar reports in the market parrot these results. To be fair, some are more fine-grained than the Frey and Osborne paper but most suffer from the same basic flaws.
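The shape of the two-step method can be sketched in a few lines of Python. Everything below is hypothetical: the job names, the two made-up features (routine share of tasks and degree of social interaction) and the 0/1 labels exist only to illustrate the approach, a small hand-labelled set used to fit a classifier that then scores a much larger unlabelled set.

```python
import math

# Hypothetical features per job: (routine share of tasks, social interaction),
# both on a 0-1 scale. Label: 1 = judged automatable by a human, 0 = not.
labelled = {
    "telemarketer":     ((0.90, 0.20), 1),
    "data-entry clerk": ((0.95, 0.10), 1),
    "assembly worker":  ((0.80, 0.20), 1),
    "therapist":        ((0.10, 0.90), 0),
    "teacher":          ((0.30, 0.90), 0),
    "social worker":    ((0.20, 0.95), 0),
}
# Jobs the humans never judged; the classifier fills in the gaps.
unlabelled = {
    "hairdresser": (0.50, 0.80),
    "accountant":  (0.70, 0.40),
}

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, b, x):
    """Probability of automation for a job with feature vector x."""
    return sigmoid(w[0] * x[0] + w[1] * x[1] + b)

# Fit a tiny logistic regression by gradient descent on the hand-labelled jobs.
w, b = [0.0, 0.0], 0.0
for _ in range(5000):
    for x, y in labelled.values():
        err = predict(w, b, x) - y   # gradient of the log-loss
        w[0] -= 0.1 * err * x[0]
        w[1] -= 0.1 * err * x[1]
        b    -= 0.1 * err

# The classifier now emits a probability for every unlabelled job.
for job, x in unlabelled.items():
    print(f"{job}: p(automation) = {predict(w, b, x):.2f}")
```

The point of the sketch is the dependency chain: a few dozen human judgements at the top determine every one of the hundreds of machine-scored probabilities below.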

Flaw 1: Human fears trump tech

The great flaw is over-egging the headline. The fact that
47% of jobs may be automated makes a great headline but is a lousy piece of
analysis. Change does not happen this way. In many jobs the context
or culture means that complete automation will not happen quickly. There are
human fears and expectations that demand the presence of humans in the
workplace. We can automate cars, even airplanes, but it will be a long time
before airplanes will fly across the Atlantic with several hundred passengers
and no pilot. There are human perceptions
that, even if irrational, have to be overcome. We may have automated waiters
that trolley food to your table but the expectation that a real person will
deliver the food and engage with you is all too real.

Flaw 2: Institutional
inertia trumps tech

Organisations grow around people and are run by people.
These people build systems, processes, budget plans and funding structures that do not necessarily lead quickly to productivity gains through automation. They often
people, products and processes that put a brake on automation. Most
organisations have an ecosystem that makes change difficult – poor forecasting,
no room for innovation, arcane procurement and sclerotic regulations. This all militates
against innovative change. Even when faced with something that saves a huge
amount of time and cost, there is a tendency to stick to existing practice.

As Upton Sinclair said, “It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

Flaw 3: Low labour costs

What is often forgotten in such analyses is the business
case and labour supply context. Automation will not happen where the investment
cost is higher than hiring human labour, and is less likely to occur where
labour supply is high and wages low. We have seen this recently, in countries
such as the UK, where the low-cost labour supply through immigration has been
high, making the business case for innovation and automation weak. Many jobs could be automated but the lack of investment money, the availability of cheap labour and low wages make human labour hard to beat. There are complex economic decision chains at work here that slow down automation.

Flaw 4: Hyperbole
around Robots

Another flaw is the hyperbole around ‘robots'. Most AI does
not need to be embedded in a humanoid form. Self-driving cars do not need robot
drivers, vacuum cleaners do not need humanoid robots pushing them around. Most
AI is invisible, embedded online or in the electronics of a device. As Toby
Walsh rightly says, when he eviscerates certain parts of the Frey and Osborne
report, there’s no way robots will be cutting your hair or serving your food by
weaving through busy restaurants with several plates of food, any time soon.

The ‘Reductive Robot Fallacy’ is the
anthropomorphic tendency to equate AI with robots along with the idea that
robot technology has to look like us and do things the way humans do them. The
vast majority of robots, AI-driven machines that perform a useful function, do
not look like humans, many are online and almost invisible.

Flaw 5: Hyperbole
around AI

AI is an idiot savant. It is incredibly smart at specific
things in specific domains but profoundly stupid at flexible and general tasks.
This is why entire jobs are rarely eliminated through automation, except for very narrow, routine jobs, like warehouse picking and packaging, spray painting
a car and so on. Accountants use spreadsheets, restaurants use dishwashers,
mixers and microwaves. Most automation is partial, as the general worker still
outfoxes AI. There are severe limitations to AI in many fields, not least the sheer amount of processing power needed to fuel the applications, as well as limitations in the maths itself. There is also a great deal of hype around the 'cognitive' capabilities of AI, led, I suspect, by that misleading word 'intelligence'. As Roger Schank says, it's only 'software'. AI is not conscious and has little in the way of cognitive skills. So let's calm down on 'cognitive computing' - it's not cognitive. It may win at Go but it doesn't know it has won.

Flaw 6: Garbage-in,
garbage-out

This common flaw, as Walsh rightly spotted, is that the training data in the Frey and Osborne paper was labelled with either a 0 or a 1 probability of automation, but the outputs were probabilities between 0 and 1. This is an example, not so much of garbage-in, garbage-out, as of binary-in, range-out. You can see this
manifest itself in some absurd predictions around jobs that are unlikely to be
automated, as well as underestimates in others, like hairdressing, waiters and
cleaners. Beware of AI generated predictions.
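A toy illustration of binary-in, range-out (all jobs and numbers invented): a nearest-neighbour classifier turns hard 0/1 human labels into fractional 'probabilities' by averaging over the nearest labelled jobs, so flipping a single human judgement in the training set swings the output for a borderline job substantially.

```python
# Hypothetical training set: each job is (name, routine-task share, human label 0/1).
labelled = [
    ("telemarketer", 0.9, 1),
    ("clerk",        0.8, 1),
    ("driver",       0.6, 1),
    ("hairdresser",  0.5, 0),
    ("waiter",       0.4, 0),
    ("teacher",      0.2, 0),
]

def knn_probability(feature, data, k=3):
    """'Probability' of automation = share of the k nearest labelled jobs with label 1."""
    nearest = sorted(data, key=lambda job: abs(job[1] - feature))[:k]
    return sum(label for _, _, label in nearest) / k

borderline = 0.55  # a hypothetical borderline job
p_before = knn_probability(borderline, labelled)

# Flip one human judgement: relabel the 'driver' as not automatable.
flipped = [(n, f, 0 if n == "driver" else l) for n, f, l in labelled]
p_after = knn_probability(borderline, flipped)

print(p_before, p_after)  # binary labels in, fractional 'probabilities' out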

Flaw 7: Heuristics

The process of automation in employment is a messy business
with many variables. Heuristics can help here. First we can categorise jobs as Cognitive v Manual; then as Cognitive routine, Manual routine, Cognitive non-routine and Manual non-routine. But even the distinction between manual and cognitive is not mutually exclusive. Few manual jobs require no knowledge, planning or problem solving. These can be useful rules of thumb but the world rarely falls neatly into these binary or four-way categories. Yet they often
lie at the heart of predictive analysis. Beware of simplistic heuristics.
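The four-way heuristic, and the way real jobs straddle it, can be shown with a couple of made-up scorings (every job name, number and threshold below is illustrative, not data):

```python
def categorise(cognitive, routine):
    """Crude four-way heuristic: score each axis 0-1, threshold at 0.5."""
    kind = "cognitive" if cognitive >= 0.5 else "manual"
    rhythm = "routine" if routine >= 0.5 else "non-routine"
    return f"{kind} {rhythm}"

# Hypothetical scorings on the two axes: (cognitive, routine).
jobs = {
    "data-entry clerk":   (0.6, 0.9),
    "assembly worker":    (0.2, 0.9),
    "research scientist": (0.9, 0.1),
    # A plumber straddles the line: diagnosis and planning are cognitive,
    # pipework is manual. The 0.5 threshold decides arbitrarily.
    "plumber":            (0.5, 0.3),
}

for job, (c, r) in jobs.items():
    print(job, "->", categorise(c, r))
```

The sketch makes the flaw visible: a job sitting on the boundary gets filed into one quadrant by an arbitrary cut-off, and any prediction built on that quadrant inherits the arbitrariness.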

Flaw 8: Human bias

Bias in analysis is all too common. Take just one example: the analysis of education. The people doing the analysis are often academics or people with an academic bent. The Frey and Osborne paper conflates education into one group, as if kindergarten work were the same as academic
research. The routine aspects of education, the fact that most teachers,
trainers and lecturers do a lot of admin and work that is actually routine and
repetitive, is conveniently ignored. Google, Wikipedia and online management and learning have already eaten into the employment of librarians and teachers.
It is a displacement industry. Take one service – Google. As the task of
finding things became super-fast, the process of learning, research and
teaching became quicker. Library footfall falls, as we no longer have to troop
off to the library to get the information. Amazon has commoditised the purchase
of books. Commoditisation is what technology is good at and what Marx
recognised as a driving force in market economies. Educators don’t like to hear
this but they have a lot to gain here. Teaching is a means to an end, not an end in itself. It has been, and will continue to be, automated, not by robots but by
smart, personalised, online learning.

Flaw 9: Activities
get automated, not jobs

In truth most jobs will be partially automated. This has
been going on for centuries with technological advances. Sure, horse grooms and
carriage drivers no longer exist but car mechanics and taxi drivers do.
Typesetters have been replaced by web designers. ATMs have simply changed the
nature of bank tellers, not completely automated the process. Indeed, in many
professions the shift has been towards more customer service and less
mechanical service. What matters is not necessarily the crude measure of ‘jobs’ being automated but rather ‘activities’ being automated. By activities, we mean specific tasks, competences and skills.
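The activity-level view, roughly the approach of weighting the time spent on each activity by its technical automatability, can be sketched as a simple weighted sum. The job, the activity breakdown and every number below are invented for illustration:

```python
# Hypothetical breakdown of one job into activities:
# activity -> (share of working time, technical automatability on a 0-1 scale).
bank_teller = {
    "cash handling":        (0.30, 0.90),
    "routine paperwork":    (0.25, 0.80),
    "customer advice":      (0.30, 0.20),
    "complaint resolution": (0.15, 0.10),
}

def automation_exposure(activities):
    """Share of the job's time that is technically automatable:
    sum over activities of (time share * automatability)."""
    return sum(share * auto for share, auto in activities.values())

exposure = automation_exposure(bank_teller)
print(f"automatable share of the job: {exposure:.0%}")
```

Even a fairly high exposure computed this way points to partial automation of tasks within the job, not the disappearance of the job itself, which is exactly the ATM-and-bank-teller story above.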

Flaw 10: New jobs

“65% of
today’s students will be employed in jobs that don’t exist yet.” This is the
sort of exaggeration that feeds bad consultancy. Most will be doing jobs that
have existed for some time. Many will simply be doing jobs they didn’t plan on
doing (and don’t like) or jobs that have changed somewhat through automation. Predicting
which jobs or activities get automated is easy compared to predicting what new
jobs will be created. The net total is therefore difficult to establish. Fewer
people may be needed in certain areas but new jobs will be created, especially
in services.

Conclusion

To be fair, more recent analyses have moved on to more fine
grained concepts and data. McKinsey did a detailed analysis of 2,000-plus work activities in 800 occupations, with
data from the US Bureau of Labor Statistics and O*Net. They quantified the time
spent on these activities and the technical feasibility of automating them. NESTA did a breakdown of specific skills.

The crude headlines will continue but we’re starting to see more
detailed and realistic analysis that will lead to better predictions. This is
important, as educational bodies need to be able to adapt what they will be required to teach, as well as how and to whom they teach it. As the change
accelerates, education and training will need to be more sensitive and adaptive
to the changes. This means more accurate predicting of demand and quick
adjustments in supply. I’d go for around half of the Oxford figures with the
caveat that more service jobs will be created, so that the net total will be
10-20%. There will be no sudden shift in months but a gradual bite, activity by activity, into jobs.

This is the field I work in, invest in, write and talk about (see WildFire), so I'm not coming at this from the sceptic's point of view. AI will change the world and the world of learning, but not in the way we think it will.
