Are libraries becoming nothing more than data cash cows for the private sector?

Last week the Department for Culture, Media and Sport published its UK Digital Strategy, to much fanfare and eager anticipation amongst those of us with an interest in digital inclusion and how we advance it. The report made mention of libraries as crucial elements in the effort to advance digital inclusion (yay!), but not quite in the way many of us advocating for public library services would want (boo!). And as for how this strategy squares with the Investigatory Powers Act, well…we’ll come to that. But let’s start with the role of public libraries…

In section two of the report, under the heading “How libraries deliver improved digital access and literacy”, great play is made of the role of libraries. They “have an important role”, they “tackle the barrier of access” and they make “significant inroads towards tackling the combined barriers of skills, confidence and motivation by offering skills training”. All of these things are true; however, this role is not the preserve of libraries and library staff alone. As the report makes clear:

“Public libraries work in partnership with charities and private partners such as Halifax, BT, and Barclays to improve the lives of some of the most socially and digitally excluded people.”

They do work in partnership with these private partners and, from the private partners’ point of view, there is a big win in doing so. As I’ve pointed out before, the way such skills sessions are delivered is a particular bonus for companies such as Barclays. By guiding members of the public towards tools that are, shall we say, less than privacy-friendly, they gain a certain advantage in marketing their products. Something, of course, that would not be encouraged had library workers been providing such support (were they to receive the proper funding to do so).

Indeed, it seems to me that rather than being places where people can get online and gain the basic digital skills our society increasingly demands, libraries are becoming a gateway to massive data collection for corporations eager for more and more data to drive their marketing campaigns and, ultimately, their profits. Let’s make no mistake: if libraries were properly funded, if proper training were provided and if the service were delivered according to the ethical principles by which the professional body for librarians guides its members, digital skills would be delivered in an entirely different way.

For example, there is no good reason why search engines such as Google or Bing should be advocated over alternatives such as DuckDuckGo. They work in a similar way; one is not somehow easier to learn than the other. There is one fundamental difference, however: Google is an extremely successful data harvester. Create a Google account, log in to the Chrome browser, use your Gmail account and, voilà, huge amounts of data are being gathered about your online activities. And if you are Barclays, providing members of the public with guidance on using the internet, and you just so happen to have additional guidance on the Barclays website, well…there’s certainly an opportunity there for free direct marketing to Gmail accounts. With DuckDuckGo, there is no such data collection. No trail of your search history. You simply search, find what you want and leave no data behind.
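As a small, purely illustrative sketch of how much marketing data can ride along with an ordinary click, here is a Python snippet that strips some common tracking parameters from a URL. The example URL and the exact parameter names are my own assumptions for illustration, not taken from any Barclays or Google material:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# A few widely used click-tracking parameter names (illustrative list).
TRACKING_KEYS = {"gclid", "fbclid", "msclkid"}

def strip_tracking(url: str) -> str:
    """Return the URL with common tracking parameters removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_KEYS and not k.startswith("utm_")]
    return urlunsplit(parts._replace(query=urlencode(kept)))

# A hypothetical marketing link: only "page=2" is needed to show the page;
# the rest exists purely to attribute and track the click.
clean = strip_tracking(
    "https://example.com/offer?page=2&utm_source=mail&gclid=abc123")
# → "https://example.com/offer?page=2"
```

Everything the function removes is data about you, not data you need, which is precisely the kind of trail a privacy-respecting search engine declines to create in the first place.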

As someone who is concerned about digital inclusion, I can only conclude that the current strategy amounts not to getting people online for the benefits it brings to individuals, but to getting more people online to create benefits for corporations and the government. The more people that are online, the more data is created and, ultimately, the more profit. Getting people online is good for business. It enables a marketing strategy that is not possible if people remain offline. For little outlay, large corporations like Barclays can get people online, teach them how to expose their data, then take advantage of this for profit and business growth. Let’s not kid ourselves into believing that any corporation is seeking to tackle digital exclusion because, for example, it increases democratic engagement or accrues any other civic benefit. Likewise, given the Investigatory Powers Act and the mass surveillance it permits, the more people who are online, the better the government is able to monitor them. If you are not online, you are a black hole of data. Get connected, and you become a useful source of information. And what of the Investigatory Powers Act…

On scanning through the report it’s interesting to note that there is not a single mention of encryption technologies. Not one. There is even a section in the report called “A safe and secure cyberspace – making the UK the safest place in the world to live and work online”, yet it doesn’t mention encryption once. Why? Encryption is the single most important tool available to ensure individual safety and security online. So why isn’t it even mentioned? Because the Investigatory Powers Act is explicitly hostile to it. The government wants to discourage encryption technologies wherever possible, because encryption obscures data from the state. And it doesn’t want your data obscured, because it might be useful for intelligence purposes (it won’t…). Not only is encryption unwelcome to the government, it is also unwelcome to corporations. Use encryption technologies and you are obscuring data from them too. Data that they could use to sell you products, to generate sales, to drive profit. Encryption is bad for business when it limits the harvesting of data for profit. (But good for business when it enables secure transactions they benefit from, of course.) As Paul Bernal notes about the strategy document in terms of encryption and safety online:

@ijclark It’s desperate stuff, that anyone who knows anything about the issues will see through in a second. Designed to fool the public.

Which takes us back to where we are in terms of digital inclusion. It seems to me that the overall digital inclusion strategy is not driven by the needs of the public (if it were, why isn’t individual privacy at the forefront of the strategy when privacy is a growing concern?), but by the needs of a government that wants people online for the cost and surveillance benefits it brings, and by the needs of corporations that need data to be freely exchanged so that it can be utilised and monetised to drive profit. The needs of the general public are secondary; the prime motivator (for policy makers) is the creation of data. If our libraries were properly funded, if the people working in them were properly trained, that data would not be created on the scale it is when the banks (the banks!!) are providing that kind of support. Which, of course, should not surprise us. The weakening of public services is designed precisely to lead to a full consumerist society.

How we prevent this is a more difficult question to tackle. The causes are deeply-rooted in an ideology hostile to public services and strongly in favour of shifting people from being citizens to being consumers. The digital strategy simply makes more explicit the extent to which the government (and corporate Britain) seeks to turn us into consumers driving profits, rather than citizens engaging in the democratic process and using access to information purely for our own benefit. With the sidelining of privacy and individual freedoms in the drive towards a mass surveillance state and in the push towards “digital inclusion”, it’s clear how close that goal is to being realised.

Today the Science and Technology Committee published its report on the “digital skills crisis”, which concluded that “up to 12.6 million of the adult UK population lack basic digital skills” and 5.8 million have “never used the internet at all” (you can view the full report here). In setting out the report, the Committee makes the following claim:

Digital exclusion has no place in 21st Century Britain. While the Government is to be commended for the actions taken so far to tackle aspects of the digital skills crisis, stubborn digital exclusion and systemic problems with digital education and training need to be addressed as a matter of urgency in the Government’s forthcoming Digital Strategy. In this report, we address the key areas which we believe the Digital Strategy must deliver to achieve the step change necessary to halt the digital skills crisis and bring an end to digital exclusion once and for all.

Which all sounds very laudable. Unfortunately, the goal of ending digital exclusion is virtually impossible in a capitalist society – digital exclusion is permanent. There will always be a large proportion of the population that is digitally excluded, no matter what effort we make to eradicate it. Indeed, the progress of the Investigatory Powers Bill rather underlines the extent to which digital exclusion is being entrenched, not eradicated.

Digital skills have no single definition, but have been variously described to include a general ability to use existing computers and digital devices to access digital services, “digital authoring skills” such as coding and software engineering, and the ability to critically evaluate media and to make informed choices about content and information—“to navigate knowingly through the negative and positive elements of online activity and make informed choices about the content and services they use”.

I’ve argued before that corporate surveillance is permanent in a capitalist society. Corporations rely on the collection of personal data to deliver profits. They make their products “free” to use, then accrue profit through the [mis-]use of personal data. In a capitalist society, individuals will always choose that which is free over that which is not (particularly the less privileged who have no choice whatsoever). Factor into this the impending Investigatory Powers Bill and we have a further undermining of any individual’s efforts to protect personal data, because private companies will store that personal data which may then be made available to the state upon request (and, incidentally, if it is your data, it will be illegal for you to be told such action has taken place).

What the situation creates is one where only a small minority of privileged individuals will be able to protect their personal data effectively (and even then, with limitations). The vast majority will not. The vast majority will not have the social or economic capital with which to make the choice to protect their personal data. They face permanently remaining on the wrong side in terms of digital inclusion, because the infrastructure is in place to prevent them from ever bridging that gap. If we are to be serious about tackling digital exclusion, then we have to take a much wider look at the protection of personal data and what that entails.

In one recent study, Jonathon Penney found that, following Edward Snowden’s disclosures about mass surveillance, there had been…

“…a 20 percent decline in page views on Wikipedia articles related to terrorism, including those that mentioned ‘al Qaeda,’ ‘car bomb’ or ‘Taliban.'”

Penney went on to conclude that:

“If people are spooked or deterred from learning about important policy matters like terrorism and national security, this is a real threat to proper democratic debate.”

This is not a controversial point, nor is it at odds with established thinking on the effects of surveillance. In 1967, for example, the President’s Commission on Law Enforcement and Administration of Justice concluded that:

“In a democratic society privacy of communication is essential if citizens are to think and act creatively and constructively. Fear or suspicion that one’s speech is being monitored by a stranger, even without the reality of such activity, can have a seriously inhibiting effect upon the willingness to voice critical and constructive ideas.”

Online privacy cannot be viewed purely in narrow terms when it comes to digital exclusion. The inability to protect one’s privacy online has serious ramifications for democratic engagement. If people are not able to seek out information or to communicate with each other in private, then they will be effectively digitally excluded. And, again, a lack of social or economic capital will ensure that a significant proportion of the population always will be. We may reduce the number of people who are digitally excluded, but we can never eradicate the problem. The only way to do so would be to ensure all online tools and methods of communication are fully encrypted, but this is impossible in a corporatised internet where data = profit. Equally, it is not possible when laws hostile to digital privacy are going through parliament.

Digital exclusion may well have “no place in 21st century Britain”. Unfortunately, a combination of government policy and prevailing economic doctrine will ensure that not only is digital exclusion a reality for those without privilege in the 21st century, it will remain so for a long time to come.

Back in 2004, the Chartered Institute for Library and Information Professionals (CILIP) published its Ethical principles for library and information professionals. Amongst its principles was the commitment to “the defence, and the advancement, of access to information, ideas and works of the imagination” (CILIP 2012). This followed its endorsement, in 2000, of the Council of Europe’s declaration that users should be able to “decide for themselves what they should, or should not, access” and that those providing the service must “respect the privacy of users and treat knowledge of what they have accessed or wish to access as confidential” (Council of Europe 2000). The privacy of the information a user has accessed is paramount, and the profession must do what it can to ensure that such access is not impeded. But do our stated ethical beliefs and our desire to commit to access to information and ideas stand up to scrutiny in the modern era?

The disclosures by Edward Snowden in 2013 have had a significant impact upon the way we view the internet. Rather than a tool that facilitates freedom of expression and creation, it has increasingly emerged as a tool enabling extensive state surveillance and the control and management of individuals. It became clear, as if there were ever any doubt, that our activities online are monitored by governments, including via data collected by large corporations (Greenwald et al, 2013; Leopold, 2014; MacAskill, 2013).

The history of surveillance

Discussions around surveillance are often linked to Jeremy Bentham’s Panopticon prison design. Bentham was influenced by his brother, on his return from Russia, to design a prison with cells arranged around an outer wall and an observation tower in the centre (Foucault 1977). The design enabled the guards to look into any of the cells at any time, whilst the prisoners were unable to view the guards. Michel Foucault, in his work Discipline and Punish, argued that the consequence of Bentham’s Panopticon was the “automatic functioning of power” – that is, the prisoners were disciplined without the need for any action to be taken against them (Foucault 1977). There was no need for the guards to exercise power, as the prisoners were managed by the belief that they could be seen. This “management” aspect of surveillance is also picked up by David Lyon, a respected figure in the surveillance studies community, who defines surveillance as the “focused, systematic and routine attention to personal details for purposes of influence, management, protection or direction” (Lyon 2007).

Of course, state surveillance is nothing new. Governments have always sought to engage in varying degrees of surveillance in order to minimise “threats”. In 1913, for example, the UK government ordered that photographs be taken of all known suffragette prisoners in order to limit their activities upon release (Casciani 2003). In 1968, the President of the United States ordered the surveillance of domestic dissident groups in the wake of the assassination of Martin Luther King Jr (Richards 2013). However, surveillance hasn’t been pursued as a strategy by the state without hesitation. Only a year before that order, the President’s Commission on Law Enforcement and Administration of Justice had warned:

“In a democratic society privacy of communication is essential if citizens are to think and act creatively and constructively. Fear or suspicion that one’s speech is being monitored by a stranger, even without the reality of such activity, can have a seriously inhibiting effect upon the willingness to voice critical and constructive ideas.” (President’s Commission on Law Enforcement and Administration of Justice 1967)

As Lyon and Foucault had observed, the use of surveillance as a tool manages and influences individuals. This may influence individuals not to threaten the state or fellow citizens, or it might inhibit their willingness to explore ideas and act as critical members of society.

Surveillance and democracy

The ability to seek out information freely and without impediment is crucial in a democratic society and as librarians we have a key role in providing that space to enable intellectual enquiry. As Zygmunt Bauman, another key figure in the study of surveillance, observes:

“Democracy expresses itself in a continuous and relentless critique of institutions; democracy is an anarchic, disruptive element inside the political system: essentially, a force for dissent and change. One can best recognize a democratic society by its constant complaints that it is not democratic enough.” (2001: 55)

To enable those complaints and critiques that mark out a society as democratic, individuals must be given the space to explore and seek out new ideas. If that space for intellectual inquiry is not available, then individuals are not able to engage in the democratic process. As librarians, with an ethical duty to ensure individuals can access information privately and a commitment to respect the confidentiality of what they access, we clearly have a duty to provide such a space.

Online encryption technologies are crucial in ensuring such a space is available for individuals both to seek out information and to communicate with others. Unfortunately, much of the political establishment in the United States and the UK is hostile to the use of such technologies. Shortly after the terrorist attacks in Paris last year, the director of the CIA, John Brennan, argued that they served as a warning about the danger of allowing encryption technologies to exist (Brennan et al. 2015). The fact that there was no evidence such technologies had been used didn’t appear to be factored into his criticisms (O’Neill 2016). Furthermore, there have been those who have argued that Snowden’s revelations heightened awareness of online encryption technologies amongst terrorists. Once more, however, experts on the so-called “Dark Web” have found no evidence of an upsurge in the use of such technologies (Flashpoint Partners 2014).
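The principle these technologies rest on can be seen in miniature with public-key encryption: anyone may encrypt a message using a published key, but only the holder of the private key can read it. The sketch below uses hopelessly small toy numbers and is for illustration only; in practice one would always use audited tools such as GnuPG, which implements the OpenPGP standard:

```python
# Toy RSA sketch: illustrative only. Never use this for real secrecy;
# real encryption needs audited implementations and large key sizes.

def egcd(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y == g.
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    # Modular inverse of a modulo m (requires gcd(a, m) == 1).
    g, x, _ = egcd(a, m)
    assert g == 1
    return x % m

# Tiny demonstration primes (an absurdly insecure key size).
p, q = 61, 53
n = p * q                  # public modulus
phi = (p - 1) * (q - 1)
e = 17                     # public exponent: (e, n) is the public key
d = modinv(e, phi)         # private exponent, kept secret

message = 42
ciphertext = pow(message, e, n)    # anyone can encrypt with the public key
recovered = pow(ciphertext, d, n)  # only the private key holder can decrypt

assert recovered == message
assert ciphertext != message       # what travels over the wire is obscured
```

An observer of the channel sees only the ciphertext; without the private exponent, the content of the communication stays hidden, which is precisely why such tools are unwelcome to both state and corporate data collectors.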

The UK government appears particularly keen on limiting the availability of private spaces online. Both the Karma Police and Tempora programmes have led to the mass collection of data (Shubber 2013; Gallagher 2015). Furthermore, the requirements of the Draft Investigatory Powers Bill underline the extent to which people’s online activity will be exposed to interrogation by the state. As for encryption technologies, David Cameron has been particularly hostile, making it clear that, as far as he is concerned, there should be no means of communication that the government cannot access and no communications between individuals conducted in private (Temperton 2015).

This raises important questions for us. If we are to accept that the privacy of our users must be respected and that “knowledge of what they have accessed or wish to access” should be treated confidentially, how does this sit with the desire of the state to obtain information about the information we are seeking out and our communications? Does the collection of such data in our libraries, on our library computers, not contravene our commitment to maintain the privacy of our users? If so, how do we tackle this?

As with all other aspects of technology, one cannot expect individuals simply to know how to use encryption technologies or to understand which tools are appropriate. Encryption technologies are not necessarily easy to use and, without knowledge of which tools to choose, individuals will not be in a position to protect themselves. Those with social capital (a network of skilled support), economic capital (the means to purchase equipment) and cultural capital (the ability to invest time in improving skills) are at a significant advantage. Given that this enables a degree of democratic engagement (freely exploring ideas and critiquing the status quo) that those without such capital cannot achieve, there are clear implications for the state of our democracy. If only those with capital are able to explore, to engage and to critique, then the democratic system will be skewed in their favour. Those who cannot engage with and critique democracy freely and without impediment will be excluded from the democratic process.

Digital inclusion

The current state of the digital divide indicates the extent to which support is needed to ensure that citizens can seek and obtain information freely and securely, without fear of a breach of their intellectual privacy. The Office for National Statistics (ONS), for example, identifies the extent to which skills and cost are impediments to people getting online. According to the latest data, of those who have never been online, 31% cite a lack of skills, 14% the cost of equipment and 12% access costs (Office for National Statistics 2015). The picture is particularly stark for the most disadvantaged in society: of those who lack basic digital skills, 69% are from the C2DE social grades (BBC Learning 2014).

This raises serious concerns in terms of democracy. Those with the skills to use the internet effectively, to utilise more advanced tools such as encryption technologies and to protect their internet activity from state or corporate eyes, will be better equipped to “think and act constructively”, to gather the information they require to critique institutions and the political system. Those who lack that ability, who don’t have the skills to use the internet to the fullest extent, will find themselves exposed to state and corporate surveillance, limiting their ability to conduct intellectual enquiry and putting themselves at risk. Given the ethical position taken by CILIP and IFLA, it is incumbent on us as information professionals to level that playing field and ensure that it is not only the most privileged in our society that can securely and privately seek out information that challenges the status quo.

Librarians and surveillance

In the United States there have already been moves to challenge this inequity. The efforts of Alison Macrina and the Library Freedom Project in particular have been key to tackling the inequality of online privacy (Burns 2015). Training members of the public and helping to establish the use of tools such as the Tor Browser (a browser that provides the user with a high degree of privacy when online) in a public library, Macrina’s Library Freedom Project has highlighted the need for library and information professionals to help ensure that citizens can access information confidentially (Doyle-Burr 2015). These efforts have been picked up by both the American Library Association (ALA) and the International Federation of Library Associations (IFLA) who have both engaged in greater efforts to raise awareness of online privacy and encryption technologies (ALA Office for Intellectual Freedom 2015; IFLA 2015).

To date, no comparable activities have taken place in UK libraries to protect intellectual privacy. Indeed, in terms of public libraries, recent moves to encourage private sector support in developing online skills have made such efforts less likely. The Barclays Digital Eagles scheme, for example, places a heavy emphasis on encouraging individuals to use commercial products such as Google and other popular online tools that do not offer the privacy and security of Tor (for browsing), DuckDuckGo (for searching) and Signal or an email service using the OpenPGP (Pretty Good Privacy) standard (for communications) (Barclays 2015). The side effect of encouraging such software is that signing individuals up for a Google account and then directing them to the Barclays website increases the likelihood of Barclays’ ads being served direct to the user’s inbox (Google 2015).

The question is: is it possible for UK libraries and librarians to replicate the work of the Library Freedom Project? It is undoubtedly more difficult in the UK to pursue such actions. Unlike the United States, we do not have a strong cultural sense of the importance of freedom of expression. Whereas the US enshrines such values in its constitution by way of the First Amendment, there is no equivalent in the UK. Whilst the First Amendment does not open doors unconditionally, it is a lever by which pressure can be applied, and one that librarians have successfully utilised in the past (Pollack 2006). Furthermore, whilst the Federal government provides funding, it doesn’t superintend library services as the British government does (CILIP 2013). A government hostile to privacy technologies (as stated by the Prime Minister) is unlikely to allow a service that it oversees to provide online security training that contravenes its wish to limit the use of such technologies. However, this should not prevent the use of tools such as DuckDuckGo, which offers greater security for an individual’s search history, or of ad blockers and other privacy-enhancing tools. And whilst libraries superintended by the government may find it difficult to teach the skills needed to protect intellectual privacy, those independent of state oversight may find there is scope to raise awareness of encryption technologies and provide support in their use.

Although encouraging the use of tools that protect intellectual privacy will not be easy in an environment where there is outright hostility from the state, we need to recognise that the protection of intellectual privacy is fundamental to the ethical framework in which we work. It is, therefore, essential to seek out and identify ways in which we can hold true to our commitment to ensure the intellectual privacy of our users. There are steps that we can take now, and there are others that will require us to come together as a profession to make happen. What we can surely all agree on is that individuals must be free to seek out information, to inform and educate themselves, freely and without impediment. The challenge for us is to ensure that we enable this right to the fullest possible extent.

CILIP, 2013. Short Briefing on the Public Libraries and Museums Act 1964. Available at: http://www.cilip.org.uk/sites/default/files/documents/Briefing on Public Libraries and Museums Act 1964.pdf.

Council of Europe, 2000. New information technologies: public access and freedom of expression in cultural institutions. Available at: http://www.coe.int/t/dg4/cultureheritage/culture/Resources/DECS_CULT_NTI_libex(2000)2_EN.pdf.

Flashpoint Partners, 2014. Measuring the Impact of the Snowden Leaks on the Use of Encryption by Online Jihadists. Available at: https://fpjintel.com/portal/assets/File/Flashpoint_Jihadi_Encryption_Software_Sept2014.pdf.

Foucault, M., 1977. Discipline and Punish: The Birth of the Prison. New York: Pantheon Books.

Gallagher, R., 2015. From Radio to Porn, British Spies Track Web Users’ Online Identities. The Intercept. Available at: https://theintercept.com/2015/09/25/gchq-radio-porn-spies-track-web-users-online-identities/ [Accessed November 8, 2015].

Librarians have a key role to play in terms of digital inclusion and protecting intellectual privacy. [Image c/o Duca di Spinaci on Flickr – CC-BY-NC license]

Towards the end of last year, I was privileged to be invited to talk at CILIP’s Multimedia Information and Technology (MmIT) Group AGM about digital inclusion as a representative of the Radical Librarians Collective (see the presentation below – which includes a list of recommended reading!). The invitation was well timed in terms of coming up with a focus for my talk as I have spent the best part of 5 months working on a journal article for the Journal of Radical Librarianship on the digital divide (which, pending peer review, will hopefully be published in the early part of this year). Specifically, I’ve been interested in looking at digital inclusion from a slightly different angle, that of the divide in terms of state and corporate surveillance.

As followers of this blog will know, I’ve been talking about surveillance and the Snowden revelations for some time now. Concerned about the gathering of information about us, whilst the state seeks to limit the amount of information we obtain about it, I’ve mainly focused on the impact this has on our democratic processes. However, since the emergence of the Library Freedom Project (founded by the awesome Alison Macrina), I’ve been increasingly interested in the role that libraries and librarianship have to play in this area. It seems to me that the disclosures have to expand the terms by which we define the digital divide. Whilst there has always been a focus on access and on skills, there must be greater attention paid to what people actually do online and, furthermore, the extent to which individuals are able to act freely in seeking information.

Being able to seek out information that offers alternatives to the status quo (indeed, not just “offers” but challenges) is vital in a democratic society. Without the ability to seek out and understand alternatives, it is hard to accept that our society can possibly be described as “democratic”. What is clear from Snowden’s disclosures is that the ability to seek out information and communicate with others whilst ensuring your intellectual privacy is increasingly difficult. Difficult unless you have the skills and knowledge with which to defend your intellectual privacy.

I tend to think that I am fairly skilled in terms of using the internet. I can seek out information quickly and efficiently, I can provide assistance for others, I am fairly innovative in the ways in which I use certain online services. What I lack, however, is the skills necessary to really ensure my intellectual privacy, to defend myself against state or corporate surveillance. I have some skills, I have some basic knowledge, but I don’t know how to protect myself fully. And yet I consider myself reasonably skilled. What about those that have difficulties in using the internet in a basic way? What about those that struggle to do the things that I take for granted? Aren’t they even more exposed to state and corporate surveillance? Isn’t their intellectual privacy even more under threat? Surveillance tends to affect the most disadvantaged to the greatest extent, is intellectual privacy something only for the privileged?

I don’t want to get into this even further here (wait for the longer version!), but I do think there are issues here about the nature of the digital divide and how we should view digital inclusion post-Snowden. There was a time when it was considered fanciful that librarians could even consider providing the sort of skills that the state may see as a threat to the status quo. However, the efforts of the Library Freedom Project in the United States underline that this is no longer the case. If librarians in the United States, the home of the NSA, can help people defend their intellectual privacy, why can’t we do the same in the United Kingdom? I’m not suggesting that we can collectively, as a profession, start setting up Tor nodes in libraries or teaching people how to use encryption technologies, but we need to have the debate about how we ensure the intellectual privacy of everyone in our society, not just the privileged few.