At Level 4 in the English Programme of Study for ICT, pupils are expected to be able to question plausibility, whilst at Level 5 they should be able to check for accuracy. What’s the difference, and are these even indicators of digital literacy?

Questions, questions

Let’s take that second question first. Generally speaking, the skills found in the ICT curriculum are also found in other subjects’ curricula, which leads some people to question whether they are even ICT skills at all. How can you tell, for example, if someone is demonstrating digital literacy as opposed to, say, numeracy? I think that is mainly a matter of context, and partly a matter of degree.

In the case of plausibility and accuracy, we see the need for these skills in everyday life – but that doesn’t necessarily mean we should exclude them from the ICT Programme of Study. For example, when I go into a supermarket, I keep an approximate tally of what I’m spending as I’m walking around the aisles. I do that by mentally adding the left-hand figures of the prices, ie the pounds. When I arrive at the checkout, I then mentally add a few pounds to cover the right-hand column, ie the pence. What I’m doing, in effect, is putting myself in a good position to judge the plausibility of the final bill. If, using my rough and ready approach, I estimate that the bill should come to around £17, I’m not going to be too upset if it comes to £20. If, however, it comes to much more than that, I will want to know why. In the terminology we’re examining now, I would question the plausibility of the data presented to me.
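The supermarket tally above can be sketched as a simple routine. This is a hypothetical illustration in Python, not anything from the article itself; the 50p-per-item allowance for the pence column and the £5 tolerance are my own assumptions for the sake of the example.

```python
def estimate_bill(prices):
    """Rough-and-ready estimate: add up the pounds (the left-hand
    figures), then a few pounds to cover the accumulated pence."""
    pounds = sum(int(p) for p in prices)  # pounds column only
    buffer = len(prices) * 0.50           # assume ~50p per item, on average
    return pounds + buffer

def is_plausible(actual_bill, prices, tolerance=5.0):
    """Question plausibility: is the bill within a few pounds
    of the rough estimate?"""
    return abs(actual_bill - estimate_bill(prices)) <= tolerance
```

So a trolley of items priced £2.99, £4.49 and £9.20 gives a rough estimate of £16.50; a bill of £17 passes the plausibility check, while a bill of £30 would prompt the question "why?".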

Accuracy, on the other hand, would, in this example, involve using a calculator or doing some spectacular mental arithmetic. I would not see these as ICT skills as such, although I suppose that, at a push, using a calculator might be construed as ICT, albeit at a low level. Certainly not a Level 5.

So how do these concepts translate into an ICT context? The question of plausibility arises every day for almost everyone with an email account. When I receive an email informing me that I have won a lottery that I have never heard of, let alone entered, I have to ask myself if this is likely to be genuine. Similarly, when I receive an email from a bank I don’t bank with asking me to provide password details. More insidious is where the sender appears to be PayPal, in which case I hover my mouse over the URL or email address to see if they are really what they appear to be.

You might think such precautionary measures are obvious, but thousands of people are fooled every year, and they are not necessarily uneducated or elderly. Being able to question plausibility in a digital context is clearly a very important skill, and not one which we can assume everyone to possess.

Being able to check for accuracy in an educational technology context has to be more than tapping some numbers into a calculator. For me it’s about setting up some sort of cross-checking facility. For example, I have a portion of my invoice spreadsheet template that calculates VAT (sales tax) backwards. If I invoice a client for £100, the VAT will be £20 and the total amount £120. My VAT checker will tell me that the original amount was £100. You may think this is pointless, especially considering that I have to enter the final amount separately into the VAT checker (otherwise it would simply hide any error), but it means that if I accidentally press the wrong key when entering the invoice amount, even if I don’t notice the error there, with any luck I’ll do a double-take when I run it through my VAT checker. Obviously, this involves an ability to question plausibility as well. If my final invoice comes out as £80, it would suggest I’ve inserted a minus sign instead of a plus sign in a formula.
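The logic of that cross-check can be sketched in a few lines. This is a minimal illustration in Python rather than the author's actual spreadsheet, assuming the UK standard VAT rate of 20%:

```python
VAT_RATE = 0.20

def invoice(net):
    """Forward calculation: net amount -> (VAT, total)."""
    vat = net * VAT_RATE
    return vat, net + vat

def vat_check(total):
    """Backward calculation: recover the net amount from the total,
    so it can be compared with what was originally entered."""
    return total / (1 + VAT_RATE)

# Cross-check: the total is entered separately, and the checker
# should reproduce the original net amount.
net = 100.0
vat, total = invoice(net)           # vat = 20.0, total = 120.0
recovered = vat_check(120.0)        # total entered separately, as in the article
assert abs(recovered - net) < 0.01  # a mismatch signals a keying error
```

The point of entering the total separately, as the article notes, is that the backward calculation is only a genuine check if it works from independently keyed-in data.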

In other words, checking for accuracy means putting into effect some checks that will, one hopes, catch errors. But how could this tie in with one of the key elements of Level 6, that of efficiency? In my opinion, efficiency means taking steps to prevent errors arising in the first place, and bringing them to your attention when they do occur.

For the former, I would suggest that an understanding of data validation techniques is called for. These can be simple. For example, in Excel you can set up an area of the spreadsheet where you can only enter, say, numbers. You can also set up the spreadsheet in a way that prevents anyone from entering data in particular cells. Advanced users could even set it up such that data is entered from drop-down lists, thereby avoiding keyboard errors altogether. You can do this sort of thing in Word too, and no doubt in other, non-Microsoft, office applications.
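The same validation ideas can be expressed outside a spreadsheet. The sketch below is a hypothetical Python analogue of Excel's data-validation rules; the range limits and the list of codes are invented for illustration:

```python
def validate_number(value, low=0, high=10_000):
    """Only accept numeric entries within a sensible range,
    much like Excel's 'decimal between' data-validation rule."""
    try:
        n = float(value)
    except (TypeError, ValueError):
        return False          # not a number at all
    return low <= n <= high

VAT_CODES = ["standard", "reduced", "zero"]  # hypothetical drop-down list

def validate_choice(value, choices=VAT_CODES):
    """Only accept entries from a fixed list, like a drop-down,
    thereby avoiding keyboard errors altogether."""
    return value in choices
```

A typo such as "forty" or an out-of-range figure is rejected before it can corrupt any downstream calculation, which is exactly the preventative step described above.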

For the latter, I would expect to see something which tells the user there is something possibly strange going on – perhaps conditional formatting in a spreadsheet, or a box which causes the word count to pop up in a document. In other words, efficiency is partly about making the document itself proactive to some extent.
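A conditional-formatting rule is essentially a small function that decides whether a value deserves highlighting. As a hedged sketch (the expected figure and the 25% threshold are assumptions for illustration, not anything prescribed in the article):

```python
def flag_implausible(amount, expected=120.0, tolerance=0.25):
    """Bring a possible error to the user's attention, the way
    conditional formatting highlights a strange-looking cell.

    Flags anything more than 25% away from the expected figure."""
    if abs(amount - expected) > expected * tolerance:
        return f"CHECK: {amount} looks implausible (expected ~{expected})"
    return "OK"
```

Run against the earlier invoice example, a final figure of £80 where roughly £120 was expected would be flagged, while £118 would pass quietly.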

Accuracy and efficiency also involve an underlying digital skill, that of a basic form of systems analysis. For example, you have to try to work out:

What kind of inaccuracies might arise?

How can they be prevented from arising?

What can be done about them if they do arise?

Such analysis will also need to take into consideration the likely skill level of the person using the document, ie the needs of the audience, which is addressed at lower Levels, notably 4 and 5. It will, or should, also involve collaborating with others, and even testing out different approaches with people from the target audience.

Ultimately, what we’re aiming for are students who can look at a set of data, in whatever form it happens to be, question its plausibility, check its accuracy, and minimise the chances of mistakes arising in the future, both their own and other people’s.


Reader Comments (4)

A very thoughtful article, Terry, and good to see this linked to the importance of developing digital literacy. Your analysis towards the end is particularly helpful, although I would also include the wider context of information as well as data. The old maxim, "garbage in, garbage out", is easy to demonstrate with a spreadsheet: although the functions work, the numbers still need to make sense. Think about the Mars lander: now, did we design in feet or meters, chaps?

Information can be even more tricky. Alan November demonstrates this very effectively using a holocaust denial article posted on a personal page of a university web site. Without the key digital literacy skills (questioning plausibility, checking information and understanding the digital context of the site), the learner lacks the validation tools and techniques you refer to.

Unfortunately we too often assume this literacy from our 'digital natives' mistaking confidence for competence.

Thanks for this, AT. Yes, I agree with all you say. I did focus on the data side of things, but information literacy is crucial too, as you rightly point out. I also agree with your final comment. In fact, in an article on this website, Edith, who was then 14, talked about how she and her friends felt they were being "undertaught" in ICT because their teachers assumed they knew it already. The article is here:

I, too, have found that students don't always examine their resources very well. So often it seems that my students find the answer on one site and then don't bother to verify it with other sources. Any thoughts on how to encourage students to take more time with their investigation and more critical evaluation of their sources?

It's an interesting question, Melanie. I suppose one approach might be to devise a rubric in which it's clear that if students don't prove they have consulted several sources they don't get full credit. After all, you wouldn't base a career decision on just one source of information, eg if the company brochure says it's a wonderful place to work, would the student take their word for it?!