Not making the grade?

It is possible to argue that, whatever those of us in the lecturing profession might think or might like to think, from a student’s point of view the purpose of a university degree programme is to get the degree – the unit of currency for initial career advancement. In fact, it is not just the degree, but the grade recorded. For many jobs now, the assumption is that students really need a First Class degree if they are to stand any chance of the more sought-after positions.

It is often suggested – and this has been discussed in this blog – that over recent years there has been noticeable grade inflation, with students receiving objectively unmerited marks and with ever larger numbers bunched up near the top of the grade heap. As I have mentioned elsewhere, I am not convinced that this ‘inflation’ is unrelated to performance or merit, but even so it is clear that the spread of marks is not as extensive as it used to be, whatever the reasons. This may prejudice the utility of marks or grades as a tool of differentiation between graduates of different levels of ability.

So is the system of grading no longer useful? Some think so, and most recently Professor Jonathan Wolff of University College London has suggested in the Guardian newspaper that we should give all that up:

‘I’m coming to the conclusion that we should simply issue students with transcripts to record their study, and leave it at that.’

Of course there is a whole school of thought that competitive grading of achievement is wrong anyway, and that nobody should be encouraged to think of themselves as more able than anyone else. This is how the issue has been considered in school education:

‘Here are two concrete things teachers can do. First, even if they’re forced to give students a grade at the end of the term, they should avoid putting a number or letter on individual assignments. This helps to make grades as invisible as possible for as long as possible – and therefore minimizes the harm they do when students are thinking about them. Second, teachers can help neutralize the destructive effects of grades – and support students’ autonomy at the same time – by allowing students to participate in deciding what grade they’ll get at the end.’

Seen this way, degrees would become certificates of attendance rather than performance. And as we are moving speedily away from concepts of physical attendance, given the technological alternatives or more generally lower levels of inclination to turn up, they may not be much more than the confirmation that the period of registration for a course has come to an end without the student deliberately dropping out in between. What we need to consider is whether that is sufficient. I’m afraid I don’t think so.

10 Comments on “Not making the grade?”

People attend uni for different reasons – some for sport, some to get married. When I lectured in several departments at another university, I felt the grade system was employed as a measure of teaching skill. At the exam board a table went up and there were the marks for each student by lecturer – it was clear that Forbes minor had talent across the board except in so-and-so’s class. Was it him, or was the teaching below par?

“I’m coming to the conclusion that we should simply issue students with transcripts to record their study, and leave it at that.” … £9,000 a year for that! Don’t think so, somehow.

“…they should avoid putting a number or letter on individual assignments. This helps to make grades as invisible as possible for as long as possible – and therefore minimizes the harm they do when students are thinking about them” … Why oh why are we so afraid of excellence? Why do we hide behind a letter/numeric grading structure? Why not go further? What’s wrong with saying that a student got 84% in his Maths exam instead of allocating an ‘A’ or a ‘6’?

Why does everything and everyone have to be seen as equal in all things when we all know that they are not?

The majority of students attend university to increase their chances of good quality employment. That is, to get ahead of their peers in the competition for jobs. Employers see higher education qualifications as a faster way of allowing them to select candidates for jobs. So if the purpose is one of selection, why not just grade to the curve?
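Grading “to the curve” of the kind proposed can be sketched in a few lines of Python – purely as an illustration, with hypothetical band names and cut-offs rather than any institution’s actual scheme: each mark is converted to a band by its rank within the cohort, not by any absolute threshold.

```python
# A hypothetical sketch of 'grading to the curve': each mark is assigned
# a degree band by its percentile rank within the cohort, not by any
# absolute threshold. The band names and cut-offs are illustrative only.
def grade_to_curve(marks,
                   bands=((0.9, "First"), (0.6, "2:1"),
                          (0.3, "2:2"), (0.0, "Third"))):
    """Assign each mark a band according to its percentile rank."""
    ordered = sorted(marks)
    n = len(marks)
    grades = []
    for m in marks:
        # index of first occurrence = number of marks strictly below m
        p = ordered.index(m) / n
        for cutoff, band in bands:
            if p >= cutoff:
                grades.append(band)
                break
    return grades

# With these cut-offs, exactly the top decile of the cohort gets a First,
# however high or low the raw marks are.
print(grade_to_curve([40, 48, 55, 58, 62, 65, 68, 70, 72, 75]))
```

Note that under such a scheme the proportion of each class is fixed in advance, which is precisely what makes it attractive as a selection instrument and contentious as a measure of attainment.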

If the purpose is to give employers a faster way to select candidates for jobs, then it is not very cost-effective for the taxpayer or the graduate. It would be much cheaper and faster to administer an IQ test.

You may be right, Garnet. However, employers may consider that some courses do actually teach some useful stuff (eg. how to program a computer), if not very efficiently, and also they may consider it to be an arbitrary challenge (like running a marathon) which indicates a certain level of commitment and natural ability.

There’s a difference between a certificate of attendance and a certificate of satisfactory completion, though. In most professional and career situations satisfactory completion of a workplace learning course of some kind is considered to be all you need. Did you complete the course and can you now do the things it taught you? Yes or no?

It’s interesting that for all our rhetoric about graduate professionals, university practices aren’t aligned to these future professional horizons. We don’t aspire to train students to understand how the working world functions (perfectly well) without grades; instead we march looking backwards to the value systems of high schools.

The second thing that we do while grading is the work of career shortlisting for graduate employers – we presort the pile for them.

I’m not sure either of these dependent practices are really worth defending.

http://hackeducation.com/2012/04/18/coursera/ This is a good read about Coursera, Vince; it also relates to this post, particularly where students’ peer-to-peer assessment is mentioned. And yes, of course there is a deep-seated agenda, as with everything 🙂

Lecturers develop a module or course because they are, supposedly, experts in the area, and they decide course content. Now, in my own case, I *could* just refer students to a textbook and say ‘read that’. But of course, I don’t simply want students to memorise information and regurgitate it into an exam script. I want them to be able to take that material and place it into some sort of historical context concerning the development of ideas within the subject area. I want them to be able to apply that knowledge to problems they may not have encountered, or adapt it for situations outside the narrow ‘examples’ they were given. I want them to see some sort of wider relevance of the topics, or think about the problems yet to be solved. And when I award a ‘grade’ I’d hope it is an assessment of more than just ‘what they remembered from the text’.

This ‘grading’ system is advantageous to all involved. Employers increasingly look at individual grades in individual subjects. Why employ someone in a computational role if every computational mark, as evaluated by the ‘supposed expert in uni’, is poor? It benefits the students, who quickly see where their aptitudes lie and where they need to work harder. I’ve lost track of the number of students who weren’t great on the theoretical parts but were magnificent at the experimental parts of our courses. That tells THEM something about where their career directions might lie, or where their interests might be. And, in terms of postgrads, it helps us match the right person to the right research.

I think abandoning that would be crazy. And I think it would harm students more than some poorly defined notion of ‘destructive effects of grades’.