Pistorius represents a high-profile case in the debate over whether prostheses provide an unfair advantage in athletics. In 2007, the International Association of Athletics Federations (IAAF) banned the use of technical devices that unfairly disadvantage athletes competing without such aids. In Pistorius’ case, the IAAF claimed that his prosthetic legs gave his ankles increased springiness, allowing him to run faster with less exertion. Although this decision was eventually overturned, it remains highly contested whether prosthetic limbs give competitors a net advantage over able-bodied athletes.

When expanding into the discussion of human enhancement technologies, it is crucial to distinguish therapy, which aims to make people well, from enhancement, which aims to make individuals better than well by raising performance beyond the level necessary to restore or sustain health. With prostheses, it is assumed that an individual is being provided with a body part he or she is missing. Yet some instances, such as the Paralympics, show how individuals with prosthetic limbs may be gaining more, in certain respects, than they originally lost. With regard to running competitions, it remains unclear whether prosthetic technology has progressed to the point where prosthetic legs confer an advantage over able-bodied competitors.

The debate over whether prosthetics should be allowed in sport leads to a wider discussion about the justice and fairness of human enhancement in the rest of life. It is crucial to ask: will enhancement technologies make our lives better? When we extend beyond the impact of human enhancement on individual performance to consider how these technologies will affect society’s collective well-being and quality of life, the answer is probably no.

The increasing diversification of enhancement forms, from strengthening bodies and memories to magnifying intelligence and happiness, forces us to consider the longer-term consequences of altering human nature and capacity. A central concern is the role of human enhancement in transforming how we understand ourselves. Enhancing human bodies and brains beyond their natural states ultimately calls into question the current definition of being human and suggests the need for new ways of understanding ourselves in the context of enhancement technologies.

In addition to altering how we understand ourselves, advances in human enhancement carry social and political consequences by shifting the way we relate to and organize ourselves. By making unenhanced individuals appear ‘disabled’ against a backdrop of constant improvement, human enhancement technologies will change the definition of being ‘impaired.’ Regardless of sound medical health, individuals will be considered “limited and defective” when they decline new enhancement options. Such a perspective equates healthiness with the successful maximization of an individual’s physical structure and mental capacity, while defining the diseased as those with unenhanced bodies and minds.

As with nearly every technology, new innovations allow new inequalities to emerge that marginalize certain groups of people. Human enhancement technologies will prove no different, widening economic disparities. With the human body serving as the “newest frontier of commodification,” enhancement will become an enabling technology for the wealthy and a disabling one for lower-income groups. It therefore becomes pertinent to ask: will human enhancement create an unenhanced underclass? Most likely, yes. Gregor Wolbring argues that we must transform our approach to “distributive justice” in order to provide enhancements to those most in need. Recognizing this as an unlikely course of action, however, he believes the next viable option is to ensure that enhancements do not confer positional advantages in the social hierarchy.

We return to the central question: should we be trying to make people better than well through technologies of human enhancement? According to Michael Sandel, a professor at Harvard University, the answer is firmly no. In the case of human enhancement, he argues, science is progressing faster than our moral understanding of the technology. Enhancement technologies offer opportunities to treat and prevent many debilitating diseases, yet they also allow us to “manipulate our own nature” by genetically engineering our bodies, minds, and moods to make ourselves better than well. Sandel urges us to stop treating human capacity and ability as raw material to be remade at will. Ultimately, great danger lies in our ability to reshape nature, specifically human nature, to accommodate our desires.

Technologies, although created by humans, may gain autonomy over society as they mature. This theory of ‘technological momentum,’ articulated by Thomas Hughes, gains resonance when we consider the future implications of apps designed to dictate basic human actions. While apps such as GymPact and Snooze Minutes may be dismissed as insignificant reminders to work out or wake up, it is crucial to consider the repercussions of lock-in with such technologies. In both apps, we observe how necessity, rather than science, proved to be the reason for, or “mother” of, invention.

GymPact

A pair of Harvard graduates, recognizing that people dislike losing money more than they appreciate financial gain, created GymPact to coerce people into exercising more often and more regularly. The app involves a twofold weekly commitment: 1) setting a target number of gym visits and 2) agreeing to a fine ranging from five to fifty dollars for each skipped visit. The reward for exercising is small, about fifty cents per visit, which is rarely enough to incentivize cheating yet still provides satisfaction to honest users. GymPact has over 135,000 users and a ninety-two percent success rate in inspiring exercise. The 2012 Pew Internet and American Life Project report found that approximately one-fifth of all US smartphone users have downloaded a fitness app since 2010, while a Mayo Clinic study found that weight-loss program participants were more likely to exercise and lost more weight when given financial incentives.
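GymPact’s loss-aversion mechanic reduces to simple weekly arithmetic. A minimal sketch of that settlement logic, assuming a flat fine per missed visit and a fixed per-visit reward (the function and parameter names are hypothetical, not GymPact’s actual implementation):

```python
def weekly_settlement(target_visits, actual_visits,
                      fine_per_miss=10.0, reward_per_visit=0.50):
    """Model a GymPact-style weekly balance: small rewards for visits,
    larger user-chosen fines for each visit missed below the target."""
    if not 5.0 <= fine_per_miss <= 50.0:
        raise ValueError("fine must be between $5 and $50 per missed visit")
    misses = max(target_visits - actual_visits, 0)
    return actual_visits * reward_per_visit - misses * fine_per_miss

# A user who pledged 4 visits at a $10 fine but made only 3:
# 3 * 0.50 - 1 * 10.00 = -8.50
print(weekly_settlement(4, 3, fine_per_miss=10.0))
```

The asymmetry is the point: a single missed visit wipes out weeks of rewards, which is precisely the loss-aversion lever the founders relied on.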

Snooze Minutes

There are a multitude of apps designed to wake us up in the morning. iSleepin awakens users with a series of mini-games, while Wake N Shake requires users to shake their devices feverishly to stop the alarm. The app that caught my attention, however, was Snooze Minutes. Besides featuring a digital clock face with the weather, the ability to adjust screen brightness with the slide of a finger, and the capacity to share wake progress on Facebook and Twitter, Snooze Minutes monetarily penalizes its users to ensure they wake up on time. When the alarm sounds, snooze minutes drain until the user stops it. Simply pressing a button does not silence the alarm; users must trace a specified pattern with their finger to prove they are awake. After exhausting seventy snooze minutes, users must buy more to continue snoozing: thirty snooze minutes cost $0.99, while a package of 333 carries a price tag of $9.99. Alternatively, users can wait for their snooze minutes to recharge every twenty-four hours, which makes buying them look all the more absurd.
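The snooze economy described above can be modeled as a simple depleting balance. A minimal sketch under the stated rules, seventy free minutes, two purchasable packs, and a daily recharge (the class and method names are hypothetical, not the app’s actual code):

```python
class SnoozeBalance:
    """Hypothetical model of Snooze Minutes' depleting snooze budget."""

    CAP = 70                        # free snooze minutes before paying
    PACKS = {30: 0.99, 333: 9.99}   # purchasable packs: minutes -> price

    def __init__(self):
        self.minutes = self.CAP

    def snooze(self, duration):
        """Spend snooze minutes; return False once the balance runs dry."""
        if duration > self.minutes:
            return False            # must buy more minutes or get up
        self.minutes -= duration
        return True

    def buy(self, pack):
        """Buy a pack of snooze minutes; return the price paid."""
        price = self.PACKS[pack]
        self.minutes += pack
        return price

    def recharge(self):
        """Free reset back to the cap, once every twenty-four hours."""
        self.minutes = self.CAP
```

Seen this way, the purchase option only ever buys impatience: a user who waits a day gets the same seventy minutes back for free.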

What is the greater significance of these apps?

The popularity of apps such as GymPact and Snooze Minutes demonstrates that users are not simply selecting technologies that fulfill their needs but choosing technologies that will control their actions. Aware that these apps are value-laden, users appreciate having their actions dictated in specific ways at specific times. GymPact and Snooze Minutes serve as scripts that prescribe certain behaviors while proscribing others. Using the example of a door-closer, Bruno Latour underscores how technology proves relentlessly moral, a quality on which humans increasingly rely. Extending this example to apps, we see that user reliance stems from the acknowledgment and acceptance of their own fallibility. While we are often suspicious of technologies that seek to control us, GymPact and Snooze Minutes show that users want technology to compensate for their fallibility, given that they no longer trust themselves to make decisions in their best interest.

As technology advances, it is worth considering how its increasing power influences our daily decisions. Will humans blindly walk toward their own extinction by allowing technologies to dictate everything? Possibly. When innovators focus on their own technological advances, no one surveys how the combination of those advances affects the world as a whole. A sequence of small, individually sensible advances may accumulate into powerful technologies that threaten our ability to control technology, and ourselves. GymPact and Snooze Minutes are small-scale examples of technologies that may, in the future, advance to a point where users cannot exercise or wake up without technological enforcement. We may ultimately find ourselves at the mercy of technology, not because we ceded control or the machines seized power, but because machine-made decisions produce better results.

Do we subscribe to apps that control our basic human actions? Yes. Do we confront these technologies as problems? Not yet, and that answer may prove dangerous. While GymPact and Snooze Minutes may appear to be productive influences on our lives at present, we cannot afford lock-in with these technologies. In adopting apps that dictate when to exercise or wake up, we are allowing technology to gain control, a concession that may extend into other aspects of our lives. Path dependence is irreversible; we must therefore find a way to control and govern these technologies before they become too deeply entrenched.

Technology plays a crucial role when tracing the evolution of warfare throughout history. From swords to rifles to atomic bombs, and from horses to railroads to airplanes, technological advancement transforms the way wars are conceived, fought, and remembered. As we continue into the 21st century, however, military technologies are becoming increasingly radical departures from the weapons of the past.

Recently, ‘drone’ has become a buzzword in discussions of military technology. Although sometimes likened to a toy airplane with a lawn mower engine, the drone is considered an exceptional surveillance and targeting technology. The impact of drones on the military is already evident; the significance of advancing drone machinery and its availability in the private sphere, however, is only beginning to emerge.

John S. Foster Jr., a nuclear physicist and former director of the Lawrence Livermore National Laboratory who went on to serve as the Pentagon’s director of defense research and engineering, conceptualized the first modern drone in 1971. Two years later, the Defense Advanced Research Projects Agency (DARPA) built two prototypes, Praeire and Calere, which could remain aloft for two hours carrying twenty-eight-pound loads. By the mid-1990s, in a process expedited by the Cold War, DARPA had built the Predator, a drone that could loiter for twenty-four hours at 25,000 feet with a 450-pound load. By 2001, the Predator was outfitted with a laser-guided missile and a camera, and by 2009, the US Air Force was training more drone-joystick pilots than airplane-cockpit pilots. Drones continue to advance: Boeing’s Solar Eagle is scheduled to begin a five-year flight in 2014, while drones at the University of Pennsylvania are learning to think for themselves and have succeeded in plotting their own trajectories through hoops thrown in the air. Zephyr, a British drone, recently broke the world record by flying 336 hours straight and reaching an altitude of 21,562 meters.

The notion of drones intervening in our daily lives may seem far-fetched at present, but as Collingridge claims, technologies cause unanticipated, and often unwelcome, social repercussions. In The Social Control of Technology, Collingridge stresses the need to govern technologies before they become too deeply entrenched in our lives, yet recognizes the difficulty of predicting the social consequences of technologies before they are fully developed. Collingridge’s dilemma of control has two parts. First, our poor understanding of the interactions between technology and society makes us unable to confidently predict the broader social consequences of a technology in its infancy. Second, by the time a technology is sufficiently developed and its ramifications have become evident, control is difficult, costly, and slow. David Guston acknowledges both our inability to predict the results of certain technologies and that lock-in will occur whether those technologies are advantageous or harmful, but he differs slightly in insisting that by explicitly considering a technology, we can shape its future progression through “anticipatory governance.” Bill Joy, by contrast, describes technology as autonomous, meaning that science and technology follow an inevitable progression with which we must cope. He nonetheless encourages us to reconsider this fatalist perspective and find ways to challenge technological progression.

When one considers drone technology in the context of Collingridge’s, Guston’s, and Joy’s arguments, opportunities abound for drones to assert themselves in the private sphere, with deep social consequences that arrive in unexpected yet irreversible ways. Fred Kaplan, a national security columnist for Slate, voices his concerns about the unintended consequences of drone warfare within the military. Since drones can kill targets from far away, anonymously, and without risk of retaliation, Kaplan argues that they may make fighting too easy. Army chaplain and ethics instructor Keith Shurtleff expands on this sentiment, noting that “as war becomes safer and easier, as soldiers are removed from the horrors of war and see the enemy not as humans but as blips on a screen, there is very real danger of losing the deterrent that such horrors provide.”

Steps toward drone intervention in the private sphere may pose even more potent threats to civilian life. Military drone manufacturers are currently seeking to introduce remote-sensing drones into private markets such as domestic surveillance. This dramatic expansion of surveillance capacity, alongside developments in machine recognition of faces and the monitoring of personal conversations, would place our private lives under intense scrutiny. Drones fly low and anonymously, yet they are always present. While the information gathered from drone surveillance may hold deep consequences for those watched, drones themselves remain entirely unaffected: they are machines without identities, secrets, or fears. Ultimately, an asymmetrical dynamic emerges in which drones inspire a new form of anxiety among the public yet feel no reciprocal fear of retaliation. 1984 may become a reality by 2084.

The nature of the drone raises further questions about information ownership. Drones are physically present yet lack the human senses of attachment, loyalty, and morality. As inanimate objects, they neither fear for their own lives nor sympathize with those whose lives they may take. While a drone may one day oversee a construction site, inspect bridges for damage, or stand in for a household pet, it will not possess a true identity. Given that drones can and will work for anyone as they become easier to obtain and use, we must ultimately ask: Who owns the information collected from drone surveillance? Who stores it? Who shares it?

Advances in drone technology unfold within a broader technological system. Drones are no longer a contained military surveillance and reconnaissance technology. In the coming years, we must grapple with the role of this weapon as it expands beyond the military domain into private spheres of influence.