Lloyds TSB recently announced that the move of two thirds of their ICT staff to India was not to save money. The UK throughput of ICT graduates has halved over the past five years, is now below its 1996 level and is about to fall further. IR35 led to the exodus of many of the most able and ambitious independent consultants. Today we see mounting pressures to address our increasing skills shortages (quality even more than quantity) by allowing in more immigrants.


70% of the UK workforce in 2020 has already graduated and their skills are atrophying. In the case of ICT skills the half-life is about 18 months – unless renewed. But the updating of workforce skills is outside current political priorities.
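Taken at face value, an 18-month half-life implies exponential decay: roughly half of a skill set's market relevance is gone after 18 months without renewal, three quarters after three years. A minimal sketch of that arithmetic (the 18-month figure is the article's estimate; the exponential decay model is an illustrative assumption):

```python
# Illustrative only: models the article's "18-month half-life" claim
# as simple exponential decay of skill relevance over time.

HALF_LIFE_MONTHS = 18.0

def skill_relevance(months_since_training: float) -> float:
    """Fraction of skill relevance remaining after the given time."""
    return 0.5 ** (months_since_training / HALF_LIFE_MONTHS)

for months in (0, 18, 36, 60):
    print(f"{months:3d} months: {skill_relevance(months):.0%} remaining")
```

On this model a graduate who receives no updating for five years retains only about a tenth of the market relevance of their original skills – which is the mechanism behind the "soon unemployable" warning below.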

Without rapid action to remove the barriers to reskilling, many will soon be unemployable.

It is not just the threat to what is left of the UK ICT industry. The shortage of those capable of supporting computation-intensive industries threatens the continuance of the UK as a major location for leading-edge research, let alone product development and support, in pharmaceuticals, aerospace and multi-media content production and publishing.

The need is for a crusade, bringing together not only the ICT professional bodies and trade associations, but also the Trades Unions, to remove the obstacles to cost-effective reskilling and updating programmes for those already in the ICT workforce.

For over twenty years, since I ran the National Computing Centre studies into the skills crisis in the early 1980s, I have been calling for individuals to be able to offset personally funded training and mentoring costs against tax, and for employers to be able to count time under professionally supervised and accredited training, including the structured work experience that is the most expensive part of most programmes, as “education” – and thus outside national insurance and PAYE.

Politicians have regularly listened but the idea of using tax incentives, as opposed to tax and spend, has long been anathema to officials in what is now part of DIUS. Interestingly, the Treasury was always more receptive and wanted to know about the means of protecting against abuse – rather than dismissing the idea out of hand. Indeed the then Chancellor, now Prime Minister, piloted some of the ideas, including mandating “industry-strength quality control” of suppliers, in the ring-fenced funding for the highly successful Millennium Bug-Busters programme.

But we can now see the consequences of nearly twenty years of inaction on the part of what was the Department for Education and Skills/Science, since the recession of the early 1990s temporarily “cured” the ICT skills problem.

So what could/should you do?

1) Visit the CPHC website and download the excellent paper on the IT Labour Market in the UK, published on 2nd June.

2) EURIM is organising a pilot exercise focussed on security skills, at all levels – with the aim of organising a tightly focussed battering ram to break the logjam by setting precedents in an area which is becoming of critical importance to government as well as industry.

This exercise began on June 4th with a workshop funded by the Cybersecurity Knowledge Transfer Network to start mapping the security skills scene, from definitions through accreditations, qualifications and courses to materials and mentoring.

9 comments

c/ I'm a British/Australian citizen living in Australia. Although I'm willing to "drop everything" for a good UK role and come over at my own expense - I'm getting nowhere (for context I am a senior BA/PM). The Australian firms love my CV.

The pattern developing is one of ignorance (stupidity?) within corporate management and poor government policy (e.g. IR35) exacerbating the situation more than graduate numbers.

Given your experience are you able to correct/modify my perception of the situation?

b) There is a danger that the crisis will be "solved" in the short term, as in 1991, by recession. That was the year that graduate recruitment programmes were cancelled and the last time that salaries and contract rates fell. But despite outsourcing and offshoring, the skills crisis of the late 1990s, during the run-up to Y2K, was even worse than that of the mid 1980s.

c) Anecdotal evidence is that the Australians in the UK are going home because of the recession and the short-term crash in job opportunities and salaries on offer. A recession is not, however, the same as the beginning of a slump. The latter is not yet inevitable - unless we have to staff the Olympic boomlet by importing skills while continuing to allow those of the indigenous workforce to atrophy.

I am not convinced that throwing money or tax breaks at academia will solve the skills gaps.

I would prefer to take educationalists by the shoulders and give them a good shaking. They are all doom and gloom, but teenagers themselves, the customers, are all fired up with Facebook, YouTube, Garageband. And bright young things produce Web 2.0 offerings by the cartload. Today, I sense more excitement and intellectual ferment about IT than I have experienced in 50 years. Why can't the establishment tap into the excitement?

My grandson, aged 14, my main informant on these matters, has the answer. When asked by an educationalist what was the difference between the computing he does at school and what he does at home, he answered in one word: "creativity".

That's what's needed, an imaginative leap by academia, and the great and good generally, from the 1970s to 2008. They are all stuck, mouthing platitudes about partnerships between government, academia and industry. To me, the best partnership of that sort was at Bletchley Park, circa 1945. They should all think of something new for a change.

Comment by Philip Virgo:

and in 1945 the Colossi were scrapped, the team dispersed back to industry and academia and technology leadership handed to the Americans ...

Have you visited Bletchley Park recently? They even need a collecting tin to pay for the electricity to run Colossus - and cannot afford to run the Bombes or the Heath Robinson at all. Such is the value we place on those without whom the war would probably have been lost in 1943, with the Germans simultaneously winning the Battles of the Atlantic and of Kursk - both turned by the ten thousand silent geese of Bletchley Park and the (close to) a hundred thousand who ran the monitoring stations and designed and built the equipment in the Post Office, British Tabulating Machines, TSRE and along their supply chains.

I suspect you are actually reinforcing my point - the leading British universities of today are similarly key players in the supply chains of those producing the global, massively multi-player, multi-media products that your grandson uses - but will not be so for much longer.

As in 1945, we are in the process of dismantling that which gave us leadership. Who knows who gave the orders then - but we might hazard a guess that Kim Philby and his colleagues had something to do with it.

I don't blame Philby and his friends. British computer design did carry on for a decade or so after Colossus, with Ace, Deuce, Mercury, Atlas and, above all, LEO. The designers had all been at Bletchley or the Radar Establishment at Malvern, and some had gone into industry and some to the universities. They all found it easy to bring together their theoretical and engineering skills, and ushered in the Golden Age of British computer design, 1955-64.

The death knell was sounded when Harold Wilson, full of the white heat of technology, funded "computer science" departments in Universities, and all the newly minted "computer scientists" got snooty about cooperating with low-life engineers in industry. As a great man of the time, Gordon Scarratt, developer of CAFS and DAP, said, "computer science is an oxymoron, because a computer is an artefact, and you can't have a science about an artefact".

The end-product of computer science became learned papers rather than hardware and software. Innovation and creativity was promptly snuffed out by intellectual snobbery on this side of the Atlantic - except perhaps in Cambridge. Clive Sinclair and Alan Sugar briefly revived it in the '80s, with no help from government or academia. In the '90s, Tim Berners-Lee did great things, but quickly decamped to Geneva and then Boston.

Since then, the UK flame has flickered only fitfully, despite loads of government initiatives and Grand Challenges.


Comment from Philip Virgo:

All save LEO and Atlas were done in universities.

Bletchley was indeed a great melting pot - but so too are all the best applied research centres (usually funded by industry rather than government).

Unfortunately so many of the best of the talents nurtured by our better universities are now from overseas (because without their full fees the universities would be broke) and they increasingly return home afterwards - because that is where the discretionary research funding and after-tax lifestyles are.

One of my regrets is that, when I accepted a post as a Graduate Engineer with STC, I did not take the opportunity to become one of the first Chartered Engineers. By the time I came to apply, based on evidence of practical experience, no-one was available to do the assessment and I gave up. I should perhaps add that my first degree was in History - which taught me to test the "evidence" before I even began to look at the logic. However, my training in practical systems engineering (including helping reverse engineer IBM BOMP so as to do a parts explosion in 6 hours instead of 48 - our Chief Programmer then got it down to 90 minutes so we could do a factory rescheduling over lunch instead of over the weekend) was not only great fun, like super crossword puzzles, but also gave me a healthy respect for computer science - done well.

However, you cannot produce good systems if the specification is crap and you are not allowed to do the job properly - and that requires educating users and managing expectations - hence the reason I graduated to "political engineering".

I have to agree with Michael: there is always this great cry of 'skills shortage', but when you present an agency (and all recruitment is done via agencies these days) with an exact match of skills, plus more, they go strangely silent when it dawns on them you are over 40.

However, in my experience (over 35 years) there have been some great British technology companies (GEC Computers, ITL etc) and it is a shame that we have thrown away the IP of hard-won skills. I was fortunate to have the chance to do a CompSci degree as a 'mature' student but I disagree that those skills are obsolete after 18 months. This is education not training - big difference. I do wish industry would realise that what matters most is not some arbitrary basket of proprietary skills but the ability to learn new things! Another aspect to the problem is the simple lack of professionalism and maturity, from recruitment onwards.

Ian

Comment from Philip Virgo:

I should have expanded the point on the half-life of skills being 18 months: "education is what is left when you have forgotten what you learnt after taking the exams".

You are quite correct: the disciplines of, for example, systems analysis and structured programming do not go out of date. Specific programming tools and user interfaces come and go. When I entered the industry (40 years ago) there was a rule of thumb that you needed at least ten days' full-time training a year to keep up to date. I am told that still holds true. Then most of us received at least two weeks of courses, with seminars and programmed instruction texts in addition. The last survey I saw indicated that fewer than 10% receive that level of updating today - even if you include the time spent on online webinars et al.

I think it's quite illuminating to compare the respective monthly journals of the British Computer Society - most often journalistic-style articles on maximising ROI/security/whatever in IT for businesses - with those of the ACM or IEEE - interesting (and often difficult) articles and papers of a quite serious nature on areas of computer science research and development in an enormous range of fields.

I think the differences between these journals perhaps highlight a UK attitude towards IT - that the only thing that really counts is the cost of IT and how much profit business can make from it. I'm sure I read recently in a BCS publication (that was mostly filled with articles on ROI etc) a suggestion that the society needed to focus much more on business in order to cater better for its needs (to lower cost and increase profit).

What with lower wages, and Asian cultures that are genuinely interested and good at maths and science, it's hardly surprising that businesses are taking their IT offshore in order to maximise their ROI.

Having graduated in 2000 with an Engineering degree in Multimedia and Comms Systems (most employment agencies and employers don't know the difference between a BSc and a BEng), I initially found it very easy to get a job.

After leaving that job 6 years later without any IT training I struggled for a year to find work.

Eventually I took a lower position with a lower salary and, while it looked like they offered training, albeit in management, that soon fell by the wayside. After being offered another slightly lower position, but with training, I was able to negotiate some time to self-study with my current employer, but formal funded training was a definite no. It's better than nothing, but I really struggle with self-study as I like to ask questions and understand the 'whys' and 'wherefores', and after years of informal self-study it doesn't fill me with enthusiasm.

The problem I find is that I seem to end up working for small companies whose last priority is to train someone who, in their eyes, should already know what they are doing.

Additionally, the self-study 'multimedia' materials I have invested in have been less than impressive. A PRINCE2 multimedia course that cost the best part of £1000 started well but tailed off after covering less than 40% of the material needed to pass the exam. I haven't bought any multimedia-based training since. I find myself desperate to return to the classroom so I can ask questions!

I am faced with a choice...

Stay in IT, struggle to renew my skills and become ever more unemployable, while the industry continues to consolidate and outsource IT, compounding the problem.

Stay in IT, find a great employer who understands the need for continual training in IT. This is the route I have been pursuing for the past few years without much luck.

Move to another country where my skills might be better understood. Perhaps I should move to one of the countries where the UK is outsourcing to?

It appears from a number of comments here that there is something of a difference between perception and reality. My main concern is ageism - it is rife in the UK IT sector. This attitude is a conscious barrier that prevents those over 40 from obtaining a new job. On the one hand we have a supposed skills shortage, and on the other long-term unemployed IT professionals who are over 40. The single most significant blockade preventing IT professionals from returning to work is age discrimination. It must be stopped.

Comment from Philip Virgo - ageism has been rife in this industry for forty years.