The wall is one of the earliest technologies developed to counter surveillance. As a form of defense, materials piled into everything from a rough heap to intricate stonework protected us from the wind, the wild, and the watching eyes of our neighbors and enemies. Privacy has always been part of what makes us feel secure. Jay Appleton’s concept of “prospect and refuge” places our comfort zone where we can see others while remaining unobserved, like a cave in a cliff-face. When geography fails us, we build walls to suit our need for privacy.

Walls are no longer enough to make us feel safe, or to provide us with viable shelter. In fact, we now read walls as a sign of enclosure, of being walled in. The encroachment of connection brings with it threats seemingly unattached to space and time, so disconnection can seem like an absurd and regressive tactic. Engagement with media creates a compulsion to share and participate, and sociability is built into the technologies we develop today. The prescriptive use of devices serves specific interests: consumers are discouraged from fixing or tinkering with their things, drones and cameras ship with facial recognition and image stabilization, and we are never totally certain what our devices are doing without our knowledge (unless we are experts). Bruno Latour describes prescription as the embedded values, the moral and ethical dimensions, of the things we use. Who prescribes these values? They are engineered by their creators, who have in mind a very specific type of use. We are trapped in their “walled garden.”

In fact, all technologies are first geared towards specific sorts of users, and only afterwards, if at all, consider non-users, or people who choose to “misuse” technology. The drone is engineered for its operator, who can approach the windows of an apartment located in a high-rise refuge, whose occupant has little recourse except to shut the blinds. Non-users have little opportunity to choose NOT to be recognized by a user’s Google Glass, to be tracked by recommendation algorithms, or to be experimented on by the social network. There are few settings that let an iPhone user decide how and when they will be tracked (and presumably, these options mean nothing against a more powerful user who manipulates the device against its owner).

As a user, one is either in or out: connected or disconnected, engaged or withdrawn. We’re divided by imaginary binaries about being on the grid versus off it entirely. Because no one wants to be out, terms of service are quickly signed, not because we are unconcerned about what they say, but because we know our concern doesn’t always translate into effective agency.

But instead of another false binary of use/non-use, we should think of our relationship with technology as a spectrum of motivations (or tactics) and practices governing how we use devices to our own ends.

Ben Light provides an alternative to the user/non-user binary in the form of disconnective practices. Much as Michel de Certeau characterized the tactics of individuals against the strategies of systems, users employ a range of behaviors in how they choose to participate and engage with social media. Likewise, using technology to build a new sort of wall against the outside world does not make us non-users, rejectors, refusers or even “Luddites,” but instead allows for a type of creative “misuse” that provides greater personal agency: an ability to dictate the terms of our involvement with others.

Much in the way a wall has a gate or a window, our privacy can be secured through the use of an apparatus: a technology that counters surveillance (like a wall) without putting us outside the system of connections that characterizes our networked lives. Doors allow us to escape from walled-in places without destroying them altogether. It isn’t necessary to airgap everything we own and retreat to the woods. Instead, materializations based around practices and communities of misuse offer alternatives to simply accepting the vulnerability in connection.

Hacking and DIY culture provide a ready example of misusers. Unafraid of violating a TOS or voiding a warranty, these communities take some control away from engineers and technologists and place it in the hands of the userbase. Unlocking a phone, for instance, runs contrary to everything the company that designed it is working to achieve: a careful relationship with carriers, a captive market, and an OS without vulnerabilities. The hackathon, hacklabs and makerspaces are places where these ideas are workshopped, developed and produced. Other communities that foster disconnection (getaway camps or “simple living” groups) can encourage a minimalist thinking about technology, but they only offer ways to limit our own use; they don’t give us solutions for countering the use of others.

Instead, we see the rising popularity of alternatives like DuckDuckGo, or at least the idea of alternatives, like the social network Ello. These recognize that the use intended by the engineer can conflict with the use desired by the userbase. Often the engineers’ motivation is to surveil and monetize the activity of the end-user, and to do this, they build a series of prescriptive uses into their work.

Acknowledging that these systems have at least two forms of use is the first step. The second is to explore and encourage disconnective practices, potentially engaging in proscribed use. At the very least, there should be options. If we want to engage with a technology, we should have some way to dictate our terms: a Terms of Service the users can give the engineers and designers. In this way, our participation would be a negotiated activity, rather than an all-or-nothing encounter with a system that puts its own motivations first.

In the case of a drone, CyborgUnplug disconnects intrusive surveillance devices from your WiFi. Similarly, glasshole.sh is a script that disconnects Google Glass users from local networks. Both work on the premise that users are often at odds with one another: we are each using technology to different ends and with different motivations, and every wall has two sides. It helps if we can do more than simply draw the curtains on the outside. Artist Adam Harvey has similar thoughts about surveillance, and uses fashion and makeup to confound surveillance technology. Stealth Wear is a line of anti-thermal-imaging clothing, including a hijab, a burqa, and a hoodie, evoking both Trayvon Martin and a decade of Islamophobic sentiment through the eyes of what could easily be a UAV’s camera. CV Dazzle is a way of beating facial recognition software. These appear to be ways of hiding from the technologies and objects surveilling us, but that isn’t exactly right: these objects have their operators and their installers. Likewise, algorithms have their programmers, and even if a sort of instrumentalist rationality lies behind the configuration of a system, those ideas are embedded in the objects because someone has decided to use them to achieve their own ends. The logic of the system that governs objects used to surveil, oppress and marginalize is closely tied to the culture and politics of engineers and well-behaved users.
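The mechanism behind these tools is worth making concrete: the first three bytes of a device’s hardware (MAC) address form an OUI that identifies its manufacturer, so a script can scan the local network and single out devices whose addresses match a prefix registered to a surveillance-device vendor before disconnecting them. A minimal sketch of that matching step, in Python rather than the original shell (the OUI value, function names, and sample scan data here are illustrative assumptions; the real tools shell out to network utilities to scan and deauthenticate):

```python
# Sketch: flag devices on a LAN whose MAC address matches a target vendor prefix.
# TARGET_OUI is an assumption for illustration -- glasshole.sh targets a prefix
# registered to Google; the deauthentication step itself is not shown here.
TARGET_OUI = "F8:8F:CA"

def matches_oui(mac: str, oui: str = TARGET_OUI) -> bool:
    """True if the MAC address's first three bytes match the vendor prefix."""
    return mac.upper().replace("-", ":").startswith(oui.upper())

def flag_devices(scan_results: list[tuple[str, str]]) -> list[str]:
    """Given (ip, mac) pairs from a network scan, return the IPs of flagged
    devices. A real tool would then deauthenticate or block these clients."""
    return [ip for ip, mac in scan_results if matches_oui(mac)]

if __name__ == "__main__":
    # Hypothetical scan output (e.g. parsed from an ARP scan of the LAN).
    scan = [("192.168.1.10", "a4:5e:60:11:22:33"),
            ("192.168.1.23", "f8:8f:ca:44:55:66")]
    print(flag_devices(scan))  # -> ['192.168.1.23']
```

The point of the sketch is how little the wall-builder needs to know: not who is watching, only which class of device is doing the watching.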

What we must do, then, is recognize a capacity for misuse. When Ned Ludd broke his stocking frames in 1779, he apocryphally used a hammer. The hammer is Heidegger’s example of a ready-to-hand tool; breaking it transforms it into something present-at-hand, something which must be repaired or reconciled. The Luddites demanded that the looms, stocking frames and other textile equipment of the 18th century be reconciled with the social conditions they encouraged. Likewise, by misusing or interfering with the way others use technology, we break that thing and create a necessity to reconcile how it has been used. Can we break Facebook? Could we break Amazon? These forms of seemingly innocuous surveillance have inspired critiques, and some people have left the social network or deactivated their Prime accounts.

What alternatives exist? Langdon Winner suggests an “epistemological Luddism” in which we refuse to repair the artifacts of whatever politics we want to dismantle. An army of engineers stands between us and the ruination of these systems. What comes next, then, is a critical interaction that reveals the embedded intentions of a medium, the way a lobster trap allows the creature to enter but not to escape. What functionality works for the engineer-as-user, against the end-user? How can we confront the surveillance and the sousveillance all around us? Are disconnective practices feasible? Assuming they are, we will need a hammer. We need to become makers who build instruments not for the logic of engineers, but for the users and potential abusers of potentially oppressive systems.

Religious communities recognize the social conditions engendered by certain technologies and how those conditions interfere with the values of their community. This is what creates the need for Venishmartem to produce filters and modified laptops and cellphones for Haredi and Orthodox Jews; the limited functionality suits their use. Likewise, the Dolmio Pepper Hacker “disables WiFi, TVs, and mobile devices for half an hour while it’s seasoning your food,” serving families who would otherwise feel the social conditions of a media-saturated household interfere with their personal values. The walls are not enough; instead there has to be a creative engagement with what flows around them, a type of playful misuse. We bring our metaphorical hammers to the threshing machines that would otherwise sort us out, wheat and chaff, to the ends of their operators. Whether we are academics or activists, anyone with a critical orientation can recognize the need to resist these homogenizing forces creeping into every aspect of our lives.

Audre Lorde famously argued that “the master’s tools will never dismantle the master’s house.” But we may be able to exorcise the embedded politics of a tool if we break it. Misusing a tool may help us discover alternative practices: clever “life-hacks” that don’t boost our efficiency but instead restore the personal fulfillment and agency robbed from us by prescripted use.

It is a somewhat trite observation of many media scholars that we do not only use technology, but technology uses us. The expectations of engineers are fulfilled by a user-base which finds so much convenience and offers its complacency in return. It should not be a radical notion to suggest we study how technology uses us, and if necessary, decide what we can do to interfere with the use of others. This is the advantage of creative misuse.