Scrapping “ICT”

On his Spannerman blog on 11 January[1], John Spencer announced that “BETT opens as ICT is scrapped by Gove”. This was a misleading title. What Michael Gove scrapped was “the current, flawed ICT curriculum”[2], which he called on the industry and awarding bodies to replace. Mr Gove made it clear that “ICT will remain compulsory at all key stages”—and he himself continued to use the term “ICT” throughout the speech.

I am not reporting here that ICT has been scrapped, but arguing that as a term “ICT” ought now to be scrapped—and that the changes initiated by Mr Gove will probably result in this happening. An article today from the Guardian’s Digital Literacy Campaign uses “ICT” five times, “IT” six times, and “computer science” five times. This shift in the use of terminology will ultimately change the way we think about what we are doing.

According to his website, Stephen Heppell “is credited with being the person who put the C into ICT”[3]. According to Wikipedia, the first official recognition of the acronym came in the Stevenson Report of 1997—a report commissioned by Tony Blair, on whose panel of seven Professor Heppell doubtless wielded a disproportionate influence. Becta was established in the following year (also with that tell-tale “C” in its title), in effect as the executive agency tasked with implementing Stevenson.

The term “ICT” has therefore:

been driven by an educational agenda, and not by use in the wider technical community;

coincided with a Labour-initiated and Becta-managed project of educational reform.

Right from the start, the rationale for the new term was confused. The Stevenson report merely noted that it “seems to us accurately to reflect the increasing role of both information and communication technologies in all aspects of society”. This misses the point. For an educationalist like Professor Heppell, the significance of the change lay not in the fact that children were now going to study modern communications technologies—but in the fact that they were going to use them. In this sense, putting the “C” into “ICT” meant moving away from a didactic or “instructionalist” style of teaching. Communications would not principally be about accessing the teacher’s expertise, but about peer-to-peer collaboration, enabling styles of learning in which the teacher played an increasingly marginal role. It was about changing the orientation of education from the vertical to the horizontal. The “C” was about Web 2.0, and on his blog Professor Heppell celebrated “community, collaboration and creativity as ‘C’ words too”.

Most progressive teachers welcomed this as a step forwards, particularly when seen against the background of Computer Based Training, the orthodox style of e-learning in the 1990s. This had generally offered learners a repetitive and uninspiring pedagogy, in which they were shown a succession of expositive pages, followed by a multiple choice quiz to check that the previous factual material had been thoroughly absorbed. But although “ICT” represented a significant advance in thinking over the old “CBT”, we should not allow our thought to be trammelled by the perception that the world is made up of binary antitheses: that x must be good just because y is bad. As the master of the sound-bite himself had it, we should always be looking for the third way.

When Professor Heppell claims that “C” stands for “community, collaboration and creativity” as well as “communication”, he unwittingly reveals part of the problem. While “communication”, “community” and “collaboration” fit together comfortably, “creativity” is an awkward fourth member of the alliterative pattern. Highly collaborative, consensual societies have nearly always been bad at creation and innovation: most great acts of creativity have been the result of individual insight and endeavour and have frequently been fiercely opposed by the so-called “wisdom of the crowd”.

The lionising of communication has also fitted into a pattern of progressive thought which has deprecated the importance of knowledge. In the late 1980s, it became fashionable to promote the importance of skills in opposition to knowledge, another false dichotomy which ignores the fact that “knowing how” to do something is just another form of knowledge, and is often dependent on a considerable amount of “knowing that”.

The current fashion (as Mr Gove remarked in his BETT speech) is to suggest that personal knowledge is no longer required as everything can be found on the internet. Not only does this create the sort of “single point of failure” which would delight writers of dystopian fiction (I write this on the day that Wikipedia has been taken down as a political protest)—it is also nonsense. As Gove argued in his March 2011 speech to the SSAT, if you were to get on a plane and ask the pilot “Tell me, do you know how to fly this plane?”, you would not be happy if he replied “No… it’s all there on Google and Wikipedia, isn’t it?”[4].

Not only does learning by doing need to be balanced by the acquisition of knowledge, but “doing” includes many forms of activity other than communication: individual activities like competitive gaming (the forgotten “C”), using creative tools, and undertaking research. Education needs to balance many things: creativity and knowledge, group-work and individual endeavour, divergent thinking and academic rigour. Virtue is a middle way[5].

An acronym which privileges one type of pedagogy over others may not be particularly helpful in this respect—but this is not sufficient reason to abandon a term which has achieved common currency. The etymology of a word or acronym is not that important: who really cares whether we talk of “ICT”, “IT” or just “technology”?

What matters about words, I suggest, is not their etymology but their granularity—the degree of precision that they allow. This is the important test that “ICT” fails because it refers to two quite separate things:

the teaching of ICT as an end of education;

the use of ICT as a means of education.

Technology can be used (for example in the form of interactive whiteboards or MIS) without students necessarily understanding how they work or even using these technologies themselves; while large parts of computer science are commonly taught using traditional blackboards and textbooks.

The failure to distinguish between technology as a means of education and technology as an end of education reached its high-water mark in the Rose Review, which was predicated on the fact that ICT was “’an essential skill for learning and life’, comparable to literacy or numeracy”[6], lamenting the fact that the “use of ICT is not sufficiently embedded in curriculum goals and design”[7]. In this model, there is no significant distinction made between using ICT and teaching ICT (or more generally, between what should be taught and how it should be taught). From this perspective, the failure to use ICT successfully in schools should be blamed, not on the lack of appropriate strategic leadership from Becta, but on the weak “digital literacy” of the poor bloody infantry.

It is true that the teaching and use of ICT in schools might be synergistic; but they are also frequently antagonistic, as when children use their ICT skills to produce plagiarised work, are distracted by off-task social networking, or focus on prettifying their work rather than on its academic content.

The relationship between ICT-as-a-means and ICT-as-an-end of education needs to be carefully examined—and this cannot be done unless we have two different words for the two different things that we are talking about. By using the same “ICT” term, Becta consistently failed to deliver this kind of clear thinking.

The muddying of the waters may well have been deliberate. Becta’s justification for its own existence came increasingly to rely on its role in delivering “aggregated procurement”, which it (falsely) claimed to be saving the taxpayer a great deal of money. ICT was driven by supply-side interventions. Large amounts of hardware were pushed into schools (often by government contractors with no real educational background), with little clarity on how it was to be used. Many argued that such a justification was unnecessary: ICT was just a good-in-itself, an up-to-date, 21st century sort of thing. The fact that ICT required no justification suited Becta very well.

Second, ICT became increasingly identified with the attempt by campaigners such as Professor Heppell, Sir Ken Robinson and Lord Puttnam, not to improve the efficiency of education but to change its goals. Teaching people “twenty-first century skills” would not so much improve their learning in other subjects as change the nature of the curriculum, moving the emphasis away from the acquisition of knowledge, away from traditional concepts of academic rigour and excellence that only the best can attain, and towards self-expression, a world in which everyone’s opinion was equally valuable, and a conception of education as an intrinsically egalitarian project.

I shall be examining this perspective in more detail in future posts—but it is sufficient here to note that all these campaigners not only urged the introduction of ICT as a means of improving learning, but also suggested that the benefits of this approach will only become apparent when we have changed the way in which those benefits are evaluated. They tacitly admit that they cannot show that ICT is improving learning when this is measured against traditional objectives, traditionally assessed. In these circumstances, the intellectual fog created by the poorly defined term “ICT” provides useful cover for a campaign which has been at heart political and not technological.

So long as the education technology community continues to advocate wasteful processes of aggregated procurement and politicised campaigns for educational reform, it cannot expect to be taken seriously by a government which is in favour of cutting bureaucracy in its administration, and of re-introducing academic rigour into the curriculum. If it is to make the case effectively for more technology in schools, the community would make a good start by adopting language which allows for a clearer discussion of what it is trying to do, and how it is trying to do it. To this end, I suggest that it is time to scrap “ICT” as a term, both for its ambivalence and for its political baggage, and speak instead:

of “education technology” for the use of IT to improve standards of learning;

of “computer studies” for the teaching of IT as a part of the curriculum.

[5] A theory advocated by Aristotle in Book 2 of his Nicomachean Ethics (http://classics.mit.edu/Aristotle/nicomachaen.2.ii.html). This perspective contrasts favourably with the more common dualist perception of good and evil, advocated by monotheistic religions and twentieth century ideologies, which encourages enthusiasts to push what were originally sensible positions to the point of excess.

11 thoughts on “Scrapping “ICT””

Good post. I agree that we need to be clearer about our terms – but think we need even more granularity than the two terms you suggest provide. My preference, drawing on the Royal Society Report on Computing in schools, is for five (yes 5, really) terms:

ICT : this is the National Curriculum subject. It is currently statutory.

IT : this is a subject usually taught in secondary schools at GCSE and A level. This usually involves the application of software (often in business contexts) and might be framed as ‘teaching Microsoft Office’.

Computing or Computer Science (CS) : this is a discipline, which includes systems thinking, algorithms/heuristics, data structures, programming, etc. It is not currently widely taught in schools.

Digital Literacy (DL) : this is about being an intelligent user of new technology. It would encompass understanding how technology impacts on society, eSafety, and how to use technology effectively (e.g. searching the internet).

Embedded Technology (ET) : this is about the use of new technology within other subjects. It reflects the fact that new technology changes the nature of disciplines – whether you are a mathematician, scientist, geographer or sports person what you do and how you do it (in the real world) is different as a result of new technology.

Given our limited resources in education – schools will never have enough resource for all the IT kit and associated professional development that they need – I think we need to think carefully about what our priorities are. There are lots of rationales for using new technology in education – explore your views about which are the most important, and help move on the debate, by filling in a short questionnaire (it only takes 9 minutes on average but may still challenge your thinking) – http://med8.open.ac.uk/dictated/start_questionnaire.php

I am all in favour of the higher granularity that you suggest. But taxonomies are often hierarchical: Alsatian is a type of dog is a type of mammal is a type of vertebrate etc. And it is important to get the distinctions right at every level of the taxonomy. So I would stick with my point that the distinction between *using* technology and *teaching* technology is the fundamental, top-level distinction that we need to make.

Of the terms you suggest, ICT, IT, CS and DL are all related to the curriculum. I really don’t understand what is meant by “embedded technology”. I *think* it means teaching technology in a cross curriculum way (and this is the only sense in which the term is used in the main body of the Royal Society report).

Would you call the school MIS “embedded”? Probably not – but it is still a form of “education technology” which, if properly exploited, has huge potential for improving the management of schools.

Is an interactive whiteboard “embedded”? And if “embedded” means “being on the school premises”, then what does something look like when it is *not* “embedded”? Maybe it just means “in use”, rather than being left in the cupboard or on the shelf?

So it seems to me, Peter, that your comment confirms my point. All five of your categories are about technology in the curriculum (which is explicitly the subject of the Royal Society report) and none of them seem to me to be about the *use* of technology to improve learning.

It is this lack of a term which we can use to discuss the *use* of technology which I suggest is skewing the debate something rotten. It reminds me of the planet invented by Douglas Adams where everyone was euphorically happy for the simple reason that, in their language there was no word for “unhappy”.

With regard to cost implications, I think that there are huge cost savings to be made by the intelligent application of education technology. I shall write a separate post about this – but the essential argument is that the human resource in education is very stretched and inefficiently used – and I believe that effective new education technologies will revolutionise the delivery of education. But this will never happen so long as “technology” is seen entirely as a curriculum issue.

So I don’t want to get too much into the curriculum debate, which is not my primary interest or expertise—but it is on this that Gove has fired his starting pistol. I think the type of distinctions you are making, Peter, will be important. From my position on the sidelines, I would make the following quick points.

1. I remember going to a teachers conference at Cambridge where the admissions tutor for Law said “don’t teach Law at school, teach Latin and History”, and the Computer Science tutor said “don’t teach Computer Studies at school, teach as much Maths as you can”. So, quite apart from issues of resourcing hard computer science at school, I am not sure of the academic case for it.

2. Encouraging more coding is another matter entirely – the case for this was made (to my mind) very convincingly by Seymour Papert in Mindstorms, in about 1980, I think. I believe Papert went on to act as patron of the modern Lego robotics systems, but these do not strike me as highly accessible for the average child or teacher, and so new, simpler approaches are needed for the mainstream curriculum. The development of mobile apps seems like a gift – and I saw one stand at BETT selling what looked like a really simple, accessible programming environment.

3. I suggested “Computer Studies” as the abstract term to cover all aspects of the technology-related curriculum, but when you have looked at all your sub-categories, Peter, you may well find a better term. The Royal Society report refers just to “Computing”. My only condition is that it should be clearly distinguished from “education technology”.

This is well argued but I’m not sure there is anything new here. I forget where it originated, but even before ICT, there was discussion about teaching ‘with’, ‘through’ and ‘about’ IT. In my mind I’ve always equated ICT with ‘teaching through’ and IT as ‘teaching about’. ‘Teaching with’ to my way of thinking allies with IWBs, CAL etc. I agree that there should be more clarity but I’m concerned that agonising over labels is a bit of a distraction.

I agree that there is nothing new in what I am saying and also that it should be pretty obvious. But read the Becta Harnessing Technology strategy documents and the Rose Review, and you find that this obvious distinction is consistently fudged. In my future posts I shall hope to show how this has led to some very warped policy making (and is still doing so).

I don’t think it matters what the label *is*. I remember a rather good exchange on the Security Council, in the build up to the Falklands War, when the Argentinian Ambassador made a long speech about how he was refusing to call the British Ambassador “Sir” Anthony Parsons, because of the title’s colonialist implications, to which Parsons, laughing fit to bust, said “you can call me whatever you like”. The label does not matter – but the distinctions between the labels do. If someone says that “Anthony is a colonialist oppressor” and there are two Anthonys in the room, then you have a communications problem. And the problem with “ICT” is worse than that – because many people do not even recognise that there is a second Anthony – and this leads to the sort of simplistic thinking that was the curse of schools ICT under Becta.

You say you use “ICT” to mean “teaching through” – but Peter Twining has just confirmed that in schools, “ICT” is a compulsory part of the curriculum. Q.E.D.

Of course, the terminology may be used rather differently in H.E. – so I should make clear that I am writing this blog from the perspective of ICT in schools.

Crispin, I think that two paragraphs of an article I’m writing sums it up:

“Gadgets – the future of eLearning?
“Jay Cross (p175) writes: ‘Designers deem a dress a success if people say that the woman wearing the dress is beautiful.’ In contrast, when I chauffeur visitors around the town they often say to me, ‘What a beautiful car!’ Whether the dress is ignored or, in my case, they enjoy the drive and ignore the skill of the driver, Jay Cross is right when he says, ‘Similarly, eLearning will be successful when it is no longer noticed.’

“We are gradually moving towards the day when eLearning will be both ubiquitous and utilitarian, when neither teachers nor learners will think of boasting about their Blackberry or Tablet. The same is true for the label ICT. As long as we see ICT as a set of tools we must learn to use, or as a subject to learn, we have lost the plot. When ICT becomes transparent in use, when each individual is naturally creative and communicative, we will no longer be looking at ‘the dress’ or ‘the limousine’ but will more clearly understand that which we set out to learn, or appreciate the person we are trying to relate to.”

At the moment education is in an ‘intermediate state’ where we are still explaining to new generations the purpose of ICT. When ICT becomes ubiquitous, when a learner is allowed to choose whatever tool they prefer, they will no longer think of ICT as a subject or ‘something to do’, any more than we think of a pen or pencil as something ‘to do’.

If thought leaders such as NAACE will accept this intermediate state theory we should, therefore, be looking towards a very different sort of curriculum.

I agree that technology needs to become more invisible and more intuitive to use (and would say that it is already doing so). I think that is normally what people are talking about when they say that they don’t want the technology to get in the way of learning.

But to have technology which is on the one hand useful and on the other invisible and intuitive, you need not less technology but more. The less that appears on the dashboard, the more there is beneath the hood (given a car of comparable performance).

I see the intrusiveness of much current education technology as being to do with the failure to develop education-specific technologies which can automate education-specific processes. And I agree with your “intermediate state” proposal, at least in the sense that it speaks to me: the more invisible and intuitive the technology becomes, the less emphasis we will need to put on training teachers to have special skills to use it. The skill required to use an iPhone is minimal. Which is why I think Michael Gove was being a bit simplistic when he suggested, in both his SSAT and BETT speeches, an antithesis between computer hardware and teacher skills. But this is the subject of a future post!

We are now further along the line of consultation about the ICT Curriculum, what its constituent parts are and what they should be called. But I thought I’d reflect on a couple of the points posted here.

The C for “communications” in the curriculum partly came from pressure from the telecommunications side of the IT industry which at the time was very distinct and exerted considerable influence.

Becta had no responsibilities for the ICT Curriculum and worked with all the subject associations, including Naace, to harness technology to enhance education.

The “ICT Mark”, whose old logo heads this blog, is testament to this, because it and the self-review framework are much more concerned with the effective, efficient and sustainable use of technology across a school than with the ICT Curriculum – which made up only one strand of one element. (Which is partly why it was rebranded with “Next Generation Learning”.)

I take Crispin’s point about hierarchies: for example, “computing” might be considered a sub-set of “computer science”, which is part of what is currently referred to as “ICT” – but that doesn’t mean that a school might not teach aspects of computer science or programming through mathematics or science.

What is emerging from current discussion is that it is not so much the ICT programme of study that is discredited as the selective and out-of-context way in which teachers and schools are teaching it.

Yes, we should welcome the calls for greater emphasis on programming and computer science within the curriculum, and in particular in the qualifications that are available at KS4. However, the current government proposals to disapply the PoS send the wrong message to schools. It is like disapplying the Highway Code because some people drive dangerously. The consequence will be that some schools will reduce curriculum time for ICT (IT, CS and DL), teachers teaching ICT badly will continue to teach it badly, and people with skills in ICT (IT, CS and DL) will be discouraged from becoming teachers because of ICT’s degraded status in the curriculum.

Management information systems, communication systems with parents, etc. don’t need to be considered as part of an ICT Curriculum.

My simple suggestion to ensure that all those schools (particularly primary schools) who have done great things with ICT continue to do so, while incorporating the Royal Society’s suggestions, would be to turn the “C” of ICT into “Computing”.

We’d then have curriculum strands that teachers could identify with in developing a new curriculum – Information (IT), Computing (CS) and Technology (DL, plus some of the higher-order DL skills and Technology Enhanced Learning not included within the Royal Society’s definitions).

Good thinking, but I’m not sure of your term DL (not in my vocab book). The ‘C’ of ICT came about when David Blunkett came back from a trip to America (1985 I think).

Also, I’m not so sure about teaching programming as such. I think that I have taught myself almost 20 languages in the last 30 years. Starting with the Commodore PET and BBC Basic, through Comal, Cobol, Fortran, Lisp, Prolog, VB and on and on. I gave up at C++. I would argue that although we may teach the basic principles of good programming (including the 4 distinct types of documentation) I am not sure that we should be teaching potential programmers any specific language until they reach university.

BUT, despite any concerns individuals might have, I feel that it is essential that Heads of ICT get their act together and, with support of our Associations, take a much firmer lead in the overall management of ICT in schools. It is a far-reaching ‘subject’ and needs the quality staff to lead it.

I am interested in your view that the “C” came from telecom companies. I am slightly surprised that they were so involved. You think Heppell’s claim is false and his position on the Stevenson enquiry irrelevant?

I won’t comment in detail on the ICT curriculum, regarding which I am only an interested observer. I am sure that there was plenty of good practice which needs to be preserved – but in my indirect experience, an awful lot of it seemed pretty dull and I think the radical decision to shake things up is a good one.

But for me, the really interesting part of Michael Gove’s speech at BETT was not his overturning of the “dull” ICT curriculum, but his call for “a serious and intelligent discussion of how technology is going to transform education”.

My argument is not about the scrapping of the curriculum but about the scrapping of the term. It is that under the Heppell/Becta “ICT” orthodoxy, the conversation about the transformation of education was constantly eclipsed by the assumption that you achieve this by *teaching* technology, rather than by *using* it, and by the somewhat lazy assumption that everyone knew what technology was (browsing, blogging and twittering), rather than recognising the need to discover and develop the *right* technology for this educational purpose.

Two and a half months after Mr Gove’s call, I still do not see any evidence of such an intelligent and serious conversation happening. The priority still seems to be on the teaching of ICT as a subject.

Crispin
I agree with your statement about the need for recognition of the two areas of ‘education technology’ and ‘computer studies’ (or whatever term we finally decide on) in schools. Are we not, though, in danger of creating a complete divide between these areas if we’re not careful – and is this all that the rhetoric is doing at the moment? We seem to be swinging from one end of the scale to the other without stopping to look at what is in the middle, simply responding to the political whim of a policy maker rather than taking a real look at what students need. It isn’t an ‘all or nothing’ approach that is required: it has to provide the mix that gives (excuse the old term) a ‘broad, balanced and relevant curriculum’.

This appears to be what has gone out of the window again. When easy-to-use digital video and audio editing were introduced, the aspects that are now being asked for disappeared, because it was easy to produce ‘high quality outcomes’ that students appeared to enjoy. In some cases (not all), this was just the same as in the early days of the introduction of technology – like adding clip art to make items look good without any real thought as to what was going on: ‘I will do it because I can’. If we continue the drive towards either end of the spectrum, we will miss out many of the things that learners need to come to terms with.

Can’t we find the middle ground – surely it exists? Schools where the subject is taught effectively, with a full range of content giving students a full insight into what can be achieved, as well as providing opportunities for its application to help increase the quality of learning, are those that have succeeded. We need to move away from the ‘one size fits all’ approach we have seen delivered, and start to look at the needs of the students and not simply the needs of the school.

David, thanks for the comment. While I do think there is a need for a serious correction, I also agree that whenever such a correction happens there is always a danger of over-correcting and setting up a pendulum effect. Nor, in drawing the distinction between “education technology” and “computer studies” (or whatever), do I wish to suggest that there may not be important synergies between them. But there might also be antagonisms – so we need to approach this sort of question with care and intelligence.

I think we should be slow to blame Mr Gove for all our woes. I imagine that being Secretary of State is a rather less omnipotent position than we outsiders like to imagine, and more like steering a supertanker with a rather wonky rudder. Whenever someone like Mr Gove makes a speech about ICT, it is interesting to me how the press and most of the community home in on the curriculum and seem completely unaware of the instrumental potential of technology in improving education. Mr Gove asked for a “serious and intelligent conversation about how technology will transform education” and he cannot be blamed for the fact that the conversation on SchoolsTech.org.uk was – in the words of Merlin John – so “lacklustre”.

So I agree with where you are coming from (or perhaps, more accurately, where you are trying to go to), but we might disagree about some of the moves required along the way.

For example, I am sceptical about training as a lever for change. The expectation that every rank-and-file classroom teacher is going to adopt a hugely reflective stance towards their use of technology has not been borne out by experience. That is why I think that more of the “intelligence” you speak of (and which may have been developed in the best schools) needs now to be encapsulated in the software, which should be easy-to-use out-of-the-box so that it can be replicated. In my view, the skills that are required by every teacher and learner, and which might be covered by the term “digital literacy”, are of a fairly simple kind: the digital equivalent of knowing how to use the library. When looking at what has undoubtedly been achieved over the last decade, I think we need now to think about how this can be scaled up and replicated across the whole of the education service.

This is where my argument leads onto the need to encourage a vibrant, innovative education software industry – something which has been repeatedly undermined to date by bureaucratic meddling by Becta.

How an appropriate curriculum and an innovative, education-specific technology market can support one another is then the key question which I think you are focusing on, David. And there I also agree with your argument that we want to steer away from a “one size fits all” prescription. Not only (I would argue) because people are different, but because we must leave space for experimentation and innovation. And is that not the direction in which the current government is taking us?