The world of educational technology is clearly a favorite subject of mine, particularly as it applies to developing contexts. We have seen a veritable litany of attempts, some seemingly well-intentioned and others not so much, to avoid the dirty work of teacher training in favor of automation or some sort of teach-by-numbers approach (see any number of posts we have done on this). We have seen naive approaches to student surveillance and lax approaches to protecting the data that emerges as a result. All of this is, more or less, an effect of neoliberal policy designed, overtly or not, to pull back or disinvest from commitments to the public sector at the national level, in favor of privatization or whatever the market will bear. In some communities, this means a repurposing of education towards a more market orientation. In other communities, particularly developing nations, the process is rather more fraught with danger.

One aftershock of this edtech entrepreneurism is that education favors automation, technology, or something that can be sold, rather than capacity building in professional development, community building, standards, and innovative pedagogy, all of which could exist outside the scope of technology (or conceivably outside a market orientation altogether). Hence why almost every edtech startup I see has little to no educational expertise, which I suppose would be superfluous if one were hell-bent on ‘disrupting the system.’ But as these are human beings, teachers and students alike, and not mere spreadsheet entries or algorithmic optimization considerations, I will opt less for disrupting anything and more for augmenting something, and that includes privacy and data protection.

Surveillance is one of those things that falls into the former category, leaning towards technology and towards generally antiquated behaviorist notions of learning. Surveillance is needed, or so the rhetoric goes, to protect students (safety is always the Trojan Horse of these movements towards privatization) and to optimize their behavior (the surveillance produces data which can then be analyzed and optimally corrected). Imagine the surveillance data as the less attractive cousin of learning analytics. I had written before about Korea doing something akin to this, but they aren’t alone. This is fairly common, as most schools have some sort of video surveillance and, I would suspect, fairly lax rules on how that data is managed. Korea walked back their surveillance (a bit) in the face of public outcry. China has followed a similar path but has focused on classroom surveillance and, a bit surprisingly, has seen a similar level of pushback from the larger community.

“If classrooms are under surveillance at all times, instruction will definitely be influenced by outside factors and the opinions of whoever is watching,” said Xiong Bingqi, vice president of the 21st Century Education Research Institute, an influential Chinese think tank, who called the practice a violation of students’ rights and a threat to academic freedom.

After a critical article on the subject recently in The Beijing News, a prominent newspaper, several schools announced they were ending the broadcasts. But thousands of others chose to remain online and continue to draw a daily audience of cyber class monitors eager to report daydreaming students and lax teachers.

So some good and some bad. Some pushback (and encouraging exercises of journalism), while others have continued to push on, drawing larger and larger audiences. Beyond the privacy issues here are the automation and behaviorism ones. Surveillance leading to behavioral change. More paying attention. More activity from the teacher. More. More. More. Pedagogy and learning outcomes shift as a result. Lazy shortcuts to improving education.

But we mustn’t fool ourselves into thinking this is a problem only in particular contexts, where ideas of freedom and privacy are structured differently. The Electronic Frontier Foundation’s warning to this effect should signal that this level of surveillance (and lack of data protection) is a prevalent issue globally, particularly in Western contexts where this type of surveillance is obscured behind a myriad of apps and school-issued devices.

“They are collecting and storing data to be used against my child in the future, creating a profile before he can intellectually understand the consequences of his searches and digital behavior.”

Neoliberalism and legality aside, this is, ultimately, personal data. Just as teachers and school systems are bound to the protection of the child, so are they bound to the protection of the digital self of that child. Beyond that, this is a fairly myopic view of what education is. We are building autonomous, accountable, creative, and critical teachers and students here. Everything else is instrumentation.

If you want to learn more about data protection, I might suggest DLA Piper’s Data Protection Laws Around the World site. Teachers, find out what your country’s data laws are and bolster them with your own practice. School districts, shame on you if you don’t know better.

I am typing this out from Jomo Kenyatta Airport in Nairobi reflecting on the past week spent with the good people of UN Habitat, specifically those associated with the CityRAP tool. The CityRAP tool trains city managers and municipal technicians in small to intermediate sized cities in sub-Saharan Africa (SSA) to understand and plan actions aimed at reducing risk and building resilience through the elaboration of a City Resilience Action Plan.

A few caveats at the outset. This reads a bit more like an academic piece, which it largely is; it is drawn from something larger I wrote a while ago for another paper. It might also read like an attack on the SDGs, which is not my point. The point here is that the SDGs have generated some incredible results and I sincerely support them, but we must be mindful of what is being mobilised in their pursuit. My focus is education, and I suggest that the provisions of the SDGs related specifically to that field favor particular scaled interventions (or at least make those approaches particularly attractive). Scale exerts pressure on particular types of education.

As part of my association with the Centre for Research in Digital Education at the University of Edinburgh (a version of this post appears there as well), I recently traveled with colleagues to deliver a three day workshop on digital education for Syrian academics who have been displaced by the conflict. The University has worked for a long time with the Council for At-Risk Academics (CARA), a great organisation providing urgently-needed help to academics in immediate danger, those forced into exile, and many who choose to work on in their home countries despite serious risks.

We seem to have endless ideas on how to use Information and Communication Technologies for Development (ICT4D). From job creation to women’s empowerment to civic participation, a number of ICT4D interventions have been developed and implemented over the years. A common question asked in my work is “what type of technology might have the biggest impact on our society in the coming years?”. As we have learned, ICTs in themselves aren’t sufficient. While the factors contributing to the success of ICT4D have become apparent, and many have written about them, I feel there’s still a need to highlight some of them.

We have been some of the most vocal critics of Bridge International Academies (BIA), largely because most investigations and evaluations of their edtech’s impact on schooling in sub-Saharan Africa have been less than spectacular (many would say the impact is non-existent). So imagine our surprise to see Wayan Vota's latest ICTworks™ post highlighting the successes of BIA in Liberia.

We need to make women in innovation more visible, and correct the gender imbalance in the stories we tell. We need to tell more stories about the women working at the top of humanitarian innovation, and so today I sat down with Tanya Accone, Senior Advisor at UNICEF Innovation, to tell the story of a woman working at the top of a very visible humanitarian innovation team for a very visible humanitarian agency.

We do a lot of work on open learning as well, and it was clear there was tension between these open educational platforms (like Coursera, edX, etc.) and their use in local contexts, particularly in emerging economies. Open educational technologies are too often framed as a transparent instrument for educational export, keeping (specifically Western or Global North) curricula, pedagogy, and educational values intact whilst they are broadcast to a global population in deficit.

I remember when I first started hearing the buzz about bots. My first thought? 'Here we go again...' - a reaction to the endless cycles of hype followed by business-as-usual that typifies the digital sector. However, over the past few months I've had the opportunity to design a few 'bots 4 good', and I'd like to share what I've learned: how they work, what they could be useful for, and where to start if you'd like to get one. I believe that done well, they could be really useful add-ons to your digital strategy as they provide a rich 'in-between' space for mobile users who aren't fully digitally literate.

Last week, I was at TICTeC 2018, where researchers, activists and practitioners discussed the impact of civic technology, or civic tech. This blogpost summarises the discussion from the session “Two heads are better than one: working with governments”.