“It’s interesting that the reaction to open badges from senior academic managers is often to dismiss them as being child like and akin to collecting a badge for sewing at scouts.”

…and…

“I also suspect that traditional higher education providers will resist providing them because they don’t fit in with traditional academic perceptions of achievement and credentialing.”

I wonder if these academics have consulted their own faculties of education?

Of course, open badges and college degrees are not mutually exclusive. If a particular university can overcome its initial prejudice, it will see badges for what they really are: representations of achievement – just like those pieces of paper they dole out at graduation ceremonies.

There is no reason why a university couldn’t award a badge upon the completion of a degree. In fact, it could also award badges upon the completion of individual subjects within the degree. That would give the student a sense of accomplishment while in the midst of a multi-year program, and I imagine showcasing one’s backpack on the university’s VLE would become rather competitive.

Speaking of competition, I don’t see open badges as a serious disruptor of the higher education system in the way that MOOCs are. And that’s because MOOCs are disrupting the delivery of education, rather than its credentialing.

A degree will always command a certain level of gravitas. It represents a structured, comprehensive education from – according to broader society – an elite bastion of knowledge and research. In short, it equips you with the intellectual foundation to do something in that domain.

In contrast, open badges are task-oriented. Beyond the nebulous notion of “study”, they recognise the execution of specific actions. For example, Mozilla issues its Div Master Badge upon successfully using the div tag at least twice in its Webmaker Project.

If the task were passing an exam, the badge could indeed represent the acquisition of knowledge; but the spirit of open badges dictates that the task be performed in the real world, and hence represents the mastery of a skill. And this is meaningful to the corporate sector.

For example, if I were an employer who needed a graphic designer, I would seek someone who knows how to take awesome digital photos and edit them in Photoshop. So an applicant who has earned badges for digital photography techniques and advanced Photoshop operations would be an obvious candidate.

Yet if I were seeking an IT executive, I don’t think open badges would cut the mustard. Sure, badges earned by an applicant for various Java programming tasks might be attractive, but a wide-ranging role requires the kind of comprehensive education that a degree is purposefully designed to give.

When we look at learning through the lens of the college degree, we see its application in the future tense. The learner has a well-rounded education which he or she intends to draw from. In other words, the degree recognises something you can do.

In contrast, when we look at learning through the lens of the open badge, we see its application in the past tense. The learner has demonstrated their mastery of a skill by using it. In other words, the badge recognises something you have already done.

So the degrees vs badges debate isn’t really about the latter displacing the former. The emergence of badges is merely re-roasting the same old chestnut of whether degrees are necessary for the modern workplace.

In my previous blog post, Everyone is an SME, I argued that all the employees in your organisation have knowledge and skills to share, because everyone is an SME in something.

Sometimes this “something” is obvious because it’s a part of their job. For example, Sam the superannuation administrator is obviously an SME in unit switching, because he processes dozens of unit switches every day.

But sometimes the something isn’t so obvious, because we’re either too blind to see it, or – Heaven forbid – our colleagues have lives outside of the workplace.

Consider Martha, the tea lady. Obviously she’s an SME in the dispensation of hot beverages. That’s her job.

But dig a little deeper and you’ll discover that she’s also an SME in customer service and relationship management. That’s her job, too.

Oh, and she speaks fluent Polish and Russian.

May I also introduce you to Gavin, the IT grad. Gavin is proficient in several programming languages, as you would expect. In his spare time, he develops iPhone apps for fun.

You’re working on a mobile strategy, right?

Then there’s Li, the Business Development Manager. Li’s an expert in Socratic selling and knows your product specs off by heart, but did you know she’s halfway through a Master of International Business degree?

She also recently emigrated from China – you know, that consumer market you want to break into.

My point is, when we seek subject matter expertise for a project, a forum, a working group, an advisory board, or merely to answer a question, we might not see the wood for the trees.

Does your organisation have a searchable personnel directory that captures everyone’s expertise? Their experiences? Their education? Their interests? The languages they speak?
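Such a directory needn’t be elaborate. As a minimal sketch only – the `Person` fields, the `find_smes` helper and the sample entries are all hypothetical illustrations drawn from the examples above, not any real system – it might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class Person:
    name: str
    role: str
    expertise: list = field(default_factory=list)
    languages: list = field(default_factory=list)
    interests: list = field(default_factory=list)

def find_smes(directory, term):
    """Return everyone whose expertise, languages or interests match the term."""
    term = term.lower()
    return [p for p in directory
            if any(term in item.lower()
                   for item in p.expertise + p.languages + p.interests)]

# Illustrative entries only, based on the colleagues described above
directory = [
    Person("Martha", "Tea lady",
           expertise=["customer service", "relationship management"],
           languages=["Polish", "Russian"]),
    Person("Gavin", "IT graduate",
           expertise=["Java programming"],
           interests=["iPhone app development"]),
    Person("Li", "Business Development Manager",
           expertise=["Socratic selling"],
           interests=["Master of International Business", "Chinese market"]),
]

print([p.name for p in find_smes(directory, "iphone")])  # → ['Gavin']
```

The point of the sketch is that the search cuts across job titles: ask it for “Polish” and you find the tea lady, not whoever happens to have “languages” in their role description.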

I have really enjoyed following the recent argy bargy between Larry Sanger and Steve Wheeler. From a learning practitioner’s point of view, it raises issues of pedagogy, instructional design, and perhaps even epistemology.

Having said that, I think it all boils down to the novice-expert principle. As a novice, you don’t know what you don’t know. Thankfully, an expert (the teacher) can transmit the necessary knowledge to you quickly and efficiently. In eduspeak, you benefit from “scaffolding”.

Then, after you have acquired (yes – “acquired”) a foundational cognitive framework, I suggest a constructivist approach would be appropriate to expand and deepen your knowledge. In other words, now you know what you don’t know, you can do something about it.

My sector of practice is corporate rather than K-12, but in a school setting I would assume that, because the learners are children, their level of experience and prior knowledge is limited. Hence, having the basic concepts explained up front is a perfectly reasonable teaching strategy.

I wonder, though, whether conversation (online or otherwise) would indeed be a useful technique after the basics have been bedded down? Perhaps the last third or so of the class could be devoted to discourse facilitated by the teacher? Or assigned to participation in a district-wide online discussion forum? (Moderated, of course, by teachers and class nerds.)

Or – more likely – I’m exposing my ignorance of the logistics of managing a classroom.

My secondary point is that I am quite getting over the Twitterati’s tendency to devalue the role of the expert in education. Not only is the expert aware of the important facts, but they can also impart their meaning and context.

“The central argument of critical theory is that all knowledge, even the most scientific or ‘commonsensical,’ is historical and broadly political in nature. Critical theorists argue that knowledge is shaped by human interests of different kinds, rather than standing ‘objectively’ independent from these interests.”

As you can tell by that quote, Critical Theory is steeped in political science and social justice. However it all boils down to challenging any knowledge that presents itself as “certain, final, and beyond human interests or motivations” and is “considered so obviously commonsensical or natural that it is placed beyond criticism”.

In other words, Critical Theory is about myth busting.

Myths in e-learning

As the Canada Research Chair in E-Learning Practices at Thompson Rivers University, Dr Friesen applies the principles of Critical Theory to three e-learning myths:

1. The knowledge economy

2. Anyone, anywhere, anytime

3. Technology drives educational change

I’m going to provide a brief overview of Friesen’s arguments, but then I’m going to do something either incredibly brave or incredibly stupid: point out where I don’t agree with the author – an academic heavyweight about a thousand times more credible than I.

But hey, Critical Theory is all about challenging what we’ve been told. Besides, by the end of this piece you’ll realise that I essentially agree with Dr Friesen. So please bear with me!

The knowledge economy

Friesen argues that the concept of a “knowledge economy” defines knowledge as a commodity. Rather than something that should be shared openly, it has market value and thus can be bought and sold.

Friesen recognises the social implications of such a philosophy: the emergence of a classist society in which knowledge workers (ie those who have the knowledge) succeed and prosper, while service workers (ie those who don’t have the knowledge and are relegated to manual labour) struggle and subsist.

My problem with Friesen’s argument is that it doesn’t appear to bust the myth of the knowledge economy. If anything, it reinforces its truth.

He describes the current state of affairs – the encroachment of technology on traditional educational artifacts; criticisms of the schooling system in its current form; the postindustrial shift from manufacturing to services; the widening gulf between rich and poor.

Then he advocates means by which we might overcome it – recognise the value of other forms of work that complement knowledge work; cultivate a range of skill sets that relate to those other forms of work; view knowledge as an instrument of democratisation rather than as a saleable commodity.

In other words, the knowledge economy is not a myth, and there is a danger that a large proportion of the population will become disaffected by it if we don’t do anything about it.

Anyone, anywhere, anytime

Friesen argues that the catchphrase “anyone, anywhere, anytime” promotes a privileged group of people (ie white males) as the universal representation of all e-learners. However, the digital divide dictates that “anyone” does not include people in disadvantaged communities; “anywhere” does not include nations outside the OECD; and “anytime”, I suppose, is redundant in light of the first two.

My problem with Friesen’s argument in this instance is that it takes the catchphrase out of context. In the corporate sector, for example, “anyone, anywhere, anytime” is certainly not a myth. It’s entirely plausible that all the employees of a particular company can access their e-learning resources from anywhere at anytime – and if they can’t, it’s an anomaly that the IT department needs to fix quick smart.

In this scenario, the experience of the population (ie the staff) is indeed universalised. Race, gender and income have absolutely nothing to do with it.

Besides, I’d imagine there are plenty of blokes in Lithuania (not to mention in trailer parks across the US) who would take umbrage at the assumption that all white males are rich and hyperconnected. Is that, ironically, a myth that critical theorists are guilty of perpetuating?

Technology drives educational change

Finally, Friesen argues that the myth “technology drives educational change” disempowers educators. It dictates that it is not they who drive the future of their own profession, but rather technological progress. Those who adopt new technologies will go forth and conquer, while those who resist will lag behind.

In contrast, Friesen maintains that technology is only one component of a complex system. As such, it is incapable of acting alone to initiate change, but rather must interact with people in their environment who will appropriate it accordingly.

My problem with Friesen’s argument this time around is that since the dawn of time, technologies from paper to blackboards, from computers to smartphones, have changed education. The way we teach and learn today is vastly different from how we did even a mere 20 years ago.

I agree that technology hasn’t driven those changes single-handedly – after all, humans must be around to use it – but the flip side is that the changes would not have occurred if the technology had not been introduced.

Regardless of how teachers and students respond to new technologies – whether they adopt or adapt them, hack them or mash them – their world will be different. Maybe we can’t predict how it will change, but we know that it will.

Shoot the messenger

When I finished reading Friesen’s paper, I couldn’t shake the nagging feeling that I really didn’t disagree with him. I know that sounds preposterous given my observations above, but it was so.

I couldn’t put my finger on it until I realised that Critical Theory isn’t really about busting myths after all; it’s about critiquing messages.

If I do a global find for “myth” in Friesen’s paper and replace it with “message”, suddenly our views align:

1. Knowledge is increasingly seen as a commodity in today’s workplace, and it’s leading us headlong into a social crisis.

2. Anyone, anywhere, anytime access to e-learning is feasible for a privileged few.

3. Technology does not drive educational change on its own; it does so in concert with the people who use it.

The questions I feel the critical theorist must ask are: Who propagates particular messages, and why do they do it? Under what circumstances are they true or false? What are the consequences of that truth or falsehood? What can or should we do about it?

You can bet your bottom dollar that those who pontificate about the knowledge economy are those who stand to profit from it handsomely.

Just like you may appreciate the CLO of a corporation using e-learning to facilitate anywhere, anytime access to knowledge for staff, but perhaps remain rather skeptical of some official from the UN grandstanding about it on behalf of the world’s poor.

Just like you may see through a salesman’s rhetoric about the next big thing, but rest assured if that doesn’t change your world, something else will.