Implantable Medical Device Tells All: Uberveillance Gets to the Heart of the Matter

In 2015, I provided evidence at an Australian inquiry into the use of subsection 313(3) of the Telecommunications Act 1997 by government agencies to disrupt the operation of illegal online services [1]. I stated to the Standing Committee on Infrastructure and Communications that mandatory metadata retention laws meant blanket surveillance coverage for Australians and visitors to Australia. The intent behind asking Australian service providers to keep subscriber search history data for up to two years was to grant government and law enforcement organizations the ability to search Internet Protocol–based records in the event of suspected criminal activity.

Importantly, I told the committee that, while instituting programs of surveillance through metadata retention laws would likely help to speed up criminal investigations, we should also note that every individual is a consumer, and such programs ultimately come back to bite innocent people through some breach of privacy or security. Enter the idea of uberveillance, which, I told the committee, is “exaggerated surveillance” that allows for interference [1] and that I believe is a threat to our human rights [2]. I strongly advised that invoking section 313 of the Telecommunications Act 1997 should require judicial oversight through the process of a search warrant. My recommendations fell on deaf ears, and, today, we even have the government deliberating over whether or not it should relax metadata laws to allow information to be accessed for both criminal and civil litigation [3], which includes divorces, child custody battles, and business disputes. In June 2017, Australian Prime Minister Malcolm Turnbull even stated that “global social media and messaging companies” need to assist security services’ efforts to fight terrorism by “providing access to encrypted communications” [52].

Consumer Electronics Leave Digital Data Footprints

Of course, Australia is not alone in having metadata retention laws. Numerous countries have adopted these laws or similar directives since 2005, keeping certain types of data for anywhere between 30 days and indefinitely, although the standard length is somewhere between one and two years. For example, since 2005, Italy has retained subscriber information at Internet cafes for 30 days. I recall traveling to Verona in 2008 for the European Conference on Information Systems, forgetting my passport in my hotel room, and being unable to use an Internet cafe to send a message back home because I was carrying no recognized identity information. When I asked why I was unable to send a simple message, I was handed an antiterrorism information leaflet. Italy also retains telephone data for up to two years and Internet service provider (ISP) data for up to 12 months.

Similarly, the United Kingdom retains all telecommunications data for one to two years. It also maintains postal information (sender and receiver data), banking data for up to seven years, and vehicle movement data for up to two years. In Germany, metadata retention was established in 2008 under the directive Gesetz zur Neuregelung der Telekommunikationsüberwachung und anderer verdeckter Ermittlungsmaßnahmen sowie zur Umsetzung der Richtlinie 2006/24/EG, but it was overturned in 2010 by the Federal Constitutional Court of Germany, which ruled the law unconstitutional because it violated the fundamental right to secrecy of correspondence. In 2015, the issue was revisited, and a compromise was reached to retain telecommunications metadata for up to ten weeks. Mandatory data retention in Sweden was challenged by one holdout ISP, Bahnhof, which was threatened with an approximately US$605,000 fine in November 2014 if it did not comply [4]. Bahnhof defended its stance of protecting the privacy and integrity of its customers by offering a no-logs virtual private network free of charge [5].

Some European Union countries have been deliberating whether to extend metadata retention to chats and social media, but, in the United States, many corporations voluntarily retain subscriber data, including market giants Amazon and Google. It was reported in The Guardian in 2014 that the United States records Internet metadata not only for itself but for the world at large through the National Security Agency (NSA), using its MARINA database to conduct pattern-of-life analysis [6]. Additionally, the 2008 Amendments Act to the Foreign Intelligence Surveillance Act of 1978 increased the time allotted for warrantless surveillance and added provisions for emergency eavesdropping. Under section 702 of the amended act, all American citizens’ metadata is now stored. Phone records are kept by the NSA in the MAINWAY telephony metadata collection database [53], and short message service and other text messages worldwide are retained in DISHFIRE [7], [8].

Emerging Forms of Metadata in an Internet of Things World

Figure 1. An artificial pacemaker (serial number 1723182) from St. Jude Medical, with electrode, which was removed from a deceased patient prior to cremation. (Photo courtesy of Wikimedia Commons.)

The upward movement toward a highly interconnected world through the Web of Things and people [9] will only mean that even greater amounts of data will be retained by corporations and government agencies around the world, extending beyond traditional forms of telecommunications data (e.g., phone records, e-mail correspondence, Internet search histories, metadata of images, videos, and other forms of multimedia). It should not surprise us that even medical devices are being touted as soon to be connected to the Internet of Things (IoT) [10]. Heart pacemakers, for instance, already send a steady stream of data back to the manufacturer’s data warehouse (Figure 1). Cardiac rhythmic data is stored on the implantable cardioverter-defibrillator’s (ICD’s) memory and is transmitted wirelessly to a home bedside monitor. Via a network connection, the data find their way to the manufacturer’s data store (Figure 2).

Figure 2. The standard setup for an EKG. A patient lies in a bed with EKG electrodes attached to his chest, upper arms, and legs. A nurse oversees the painless procedure. The ICD in a patient produces an EKG (A), which can automatically be sent to an ICD manufacturer's data store (B). (Image courtesy of Wikimedia Commons.)

In health speak, the ICD setup in the patient’s home is a type of remote monitoring that usually happens when the ICD recipient is at rest, most often while sleeping overnight. It is a bit like how routine computer data backups are scheduled for when network traffic is at its lowest. In the future, an ICD’s proprietary firmware updates may well travel in the other direction, down to the device remotely from the manufacturer, much like installing a Windows operating system update on a desktop. In the following section, we will explore the implications of access to personal cardiac data emanating from heart pacemakers in two cases.
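As a rough illustration of the kind of record such a nightly transmission might carry, consider the sketch below. Every field name and value here is hypothetical; actual manufacturers use proprietary wire formats, and JSON merely stands in for one.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class TelemetrySnapshot:
    """One hypothetical nightly upload from a bedside ICD monitor."""
    device_serial: str
    captured_at: str            # ISO 8601 timestamp of the reading
    battery_pct: float          # remaining battery life
    mean_heart_rate_bpm: int    # averaged over the reporting window
    episodes_detected: int      # arrhythmia episodes logged on-device
    chest_impedance_ohm: float  # used to infer fluid build-up in the chest

def to_wire_format(snapshot: TelemetrySnapshot) -> str:
    # Real devices use proprietary encodings; JSON is a stand-in here.
    return json.dumps(asdict(snapshot), sort_keys=True)

snap = TelemetrySnapshot(
    device_serial="1723182",  # serial borrowed from the Figure 1 caption
    captured_at=datetime(2017, 3, 17, 2, 0, tzinfo=timezone.utc).isoformat(),
    battery_pct=81.5,
    mean_heart_rate_bpm=64,
    episodes_detected=0,
    chest_impedance_ohm=52.3,
)
print(to_wire_format(snap))
```

Even such a modest record, accumulated nightly for years, amounts to a detailed longitudinal profile of the bearer, which is precisely why access to it matters.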

CASE 1: HUGO CAMPOS DENIED ACCESS TO HIS PERSONAL CARDIAC DATA

In 2007, scientist Hugo Campos collapsed at a train station and was later horrified to find out that he had to get an ICD for his genetic heart condition. ICDs usually last about seven years before they require replacement (Figure 3). A few years into wearing the device, being a high-end quantified-self user who measured his sleep, exercise, and even alcohol consumption, Campos became curious about how he might gain access to the data generated by his ICD (Figure 4). He made some requests to the ICD’s manufacturer and was told that he was unable to receive the information he sought, despite his doctor having full access. Some doctors could even remotely download a patient’s historical data on a mobile app for 24/7 support during emergency situations (Figure 5). Campos’s heart specialist did grant him access to written interrogation reports, but Campos only saw him about once every six months after his condition stabilized. Additionally, the logs were of no consequence to him on paper, and the fields and layout were predominantly decipherable only by a doctor (Figure 6).

Figure 4. The Nike FuelBand is a wearable computer that has become one of the most popular devices driving the so-called quantified-self trend. (Photo courtesy of Wikimedia Commons.)

Dissatisfied with his denied access, Campos took matters into his own hands and purchased a device on eBay that could help him get the data. He also attended a specialist ICD course and then intercepted the cardiac rhythms being recorded [11]. He got to the data stream but realized that, to make sense of it from a patient perspective, a patient-centric app had to be built. Campos quickly deduced that regulatory and liability concerns were at the heart of the matter from the manufacturer’s perspective. How does a manufacturer continue to improve its product if it does not continually get feedback from the actual ICDs in the field? If manufacturers offered mobile apps for patients, might patients misread their own diagnoses? Is a manufacturer there to enhance life alone or to make a patient feel better about bearing an ICD? Can an ICD be misused by a patient? Or, in the worst-case scenario, what happens in the case of device failure? Or patient death? Would the proof lie onboard? Would the data tell the true story? These are all very interesting questions.

Figure 5. The medical waveform format encoding rule software on a BlackBerry device. It displays medical waveforms, such as EKG (shown), electroencephalogram, and blood pressure. Some doctors have software that allows them to interrogate EKG information, but patients presently do not have access to their own ICD data. (Photo courtesy of Wikimedia Commons.)

Campos might well have acted not only to get what he wanted (access to his data, his own way) but also to raise awareness globally as to the type of data being stored remotely by ICDs in patients. He noted in his TEDxCambridge talk in 2011 [12]:

the ICD does a lot more than just prevent a sudden cardiac arrest: it collects a lot of data about its own function and about the patient’s clinical status; it monitors its own battery life; the amount of time it takes to deliver a life-saving shock; it monitors a patient’s heart rhythm, daily activity; and even looks at variations in chest impedance to look if there is build-up of fluids in the chest; so it is a pretty complex little computer you have built into your body. Unfortunately, none of this invaluable data is available to the patient who originates it. I have absolutely no access to it, no knowledge of it.

Doctors, on the other hand, have full 24/7 unrestricted access to this information; even some of the manufacturers of these medical devices offer the ability for doctors to access this information through mobile devices. Compare this with the patients’ experience who have no access to this information. The best we can do is to get a printout or a hardcopy of an interrogation report when you go into the doctor’s office.

Figure 6. An EKG chart. Twelve different derivations of an EKG of a 23-year-old Japanese man. A similar log was provided to Hugo Campos upon his request for six months’ worth of EKG readings. (Photo courtesy of Wikimedia Commons.)

Campos decided to sue the manufacturer after he was informed that the data being generated by his ICD measuring his own heart activity were “proprietary data” [13]. Perhaps this is the new side of big data. But it is fraught with legal implications and, as far as I am concerned, blatantly dangerous. If we deduce that a person’s natural biometric data (in this instance, the cardiac rhythm of an individual) belong to a third party, then we are headed into murky waters when we speak of even more invasive technology like deep-brain stimulators [14]. It not only means that the device is not owned by the electrophorus (the bearer of technology) [15], [16], but quite possibly that the cardiac rhythms unique to the individual are also owned by the device manufacturer. We should not be surprised. In the “Software and Services” section of Google Glass’s terms of use, Google states that it has the right to “remotely disable or remove any such Glass service from user systems” at its “sole discretion” [17]. Placing this in the context of ICDs means that a third party effectively has the right to switch someone off.

Enter the Ross Compton case of Middletown, Ohio. M.G. Michael and I have dubbed it one of the first authentic uberveillance cases in the world, because the technology was not just wearable but embedded. The story goes something like this: On 27 January 2017, 59-year-old Ross Compton was indicted on arson and insurance fraud charges. Police gained a search warrant to obtain his heart pacemaker readings (heart and cardiac rhythms) and called his alibi into question. Data from Compton’s pacemaker before, during, and after the fire broke out in his home were disclosed by the heart pacemaker manufacturer after a subpoena was served. The insurer’s bill for the damage was estimated at about US$400,000. Police became suspicious of Compton when they found traces of gasoline on his shoes, trousers, and shirt.

In his statement of events to police, Compton told a story that misaligned and conflicted with his call to 911. Forensic analysts found traces of multiple fires having been lit in various locations in the home. Yet, Compton told police he had rushed his escape, breaking a window with his walking stick to throw some hastily packed bags out and then fleeing the flames himself to safety. Compton also told police that he had an artificial heart with a pump attached, a fact that he thought might help his cause but that was to be his undoing. In this instance, his pacemaker acted akin to a black box recording on an airplane [18].

After securing the heart pacemaker data set, an independent cardiologist was asked to assess the telemetry data and determine whether Compton’s heart function was commensurate with the exertion needed to make a break with personal belongings during a life-threatening fire [19]. The cardiologist noted that, based on the evidence he was given to interpret, it was “highly improbable” that a man suffering from the medical conditions that Compton did could manage to collect, pack, and remove the number of items that he did from his bedroom window, escape himself, and then proceed to carry these items to the front of his house, out of harm’s way (see “Columbo, How to Dial a Murder”). Compton’s own cardio readings, in effect, snitched on him, and none were happier than the law enforcement officer in charge of the case, Lieutenant Jimmy Cunningham, who noted that the pacemaker data, while only a supporting piece of evidence, were vital in proving Compton’s guilt after gasoline was found on his clothing. Evidence-based policing has now well outstripped the more traditional intelligence-led policing approach, an entrenchment driven by the new realm of big data availability [20], [21].

Columbo, How to Dial a Murder [S1] Columbo says to the murderer: “You claim that you were at the physician’s getting your heart examined…which was true [Columbo unravels a roll of EKG readings]…the electrocardiogram, Sir. Just before three o’clock your physician left you alone for a resting trace. At that moment you were lying down in a restful position and your heart showed a calm, slow, easy beat [pointing to the EKG readout]. Look at this part, right here [Columbo points to the reading], lots of sudden stress, lots of excitement, right here at three o’clock, your heart beating like a hammer just before the dogs attacked…Oh you killed him with a phone call, Sir…I’ll bet my life on it. Very simple case. Not that I’m particularly bright, Sir…I must say, I found you disappointing, I mean your incompetence, you left enough clues to sink a ship. Motive. Opportunity. And for a man of your intelligence Sir, you got caught on a lot of stupid lies. A lot.” [S1] Columbo: How to Dial a Murder. Directed by James Frawley. 1978. Los Angeles, CA: Universal Pictures Home Entertainment, 2006. DVD.

Consumer Electronics Tell a Story

Several things are now of interest to the legal community: first and foremost, how is the search warrant for a person’s pacemaker data executed? In case 1, Campos was denied access to his own ICD data stream by the manufacturer, and yet his doctor had full access. In case 2, Compton’s own data provided authorities with the extra evidence they needed to accuse him of fraud. This is yet another example of seemingly private data being used against an individual (in this instance, the person from whose body the data emanated), but in the future, for instance, the data from one person’s pacemaker might well implicate other members of the public. For example, the pacemaker might be able to prove that someone’s heart rate substantially increased during an episode of domestic violence [22] or that an individual was unfaithful in a marriage based on the cross matching of his or her time stamp and heart rate data with another.
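A minimal sketch of how such cross matching might work in principle is shown below. The data, sampling interval, and threshold are entirely hypothetical; the point is only that two independently logged heart-rate streams can be joined on their time stamps to find co-occurring spikes.

```python
def elevated_windows(samples, threshold):
    """Return the set of timestamps (epoch minutes) where the rate exceeds threshold.

    samples: list of (timestamp_minute, heart_rate_bpm) pairs.
    """
    return {t for t, bpm in samples if bpm > threshold}

def co_elevated(a, b, threshold=100):
    """Timestamps at which both individuals' heart rates exceed the threshold."""
    return sorted(elevated_windows(a, threshold) & elevated_windows(b, threshold))

# Hypothetical one-minute heart-rate samples for two individuals.
person_a = [(0, 72), (1, 75), (2, 118), (3, 121), (4, 80)]
person_b = [(0, 68), (1, 70), (2, 112), (3, 119), (4, 74)]
print(co_elevated(person_a, person_b))  # → [2, 3]
```

Crude as it is, this kind of join is exactly what makes time-stamped biometric streams so attractive, and so dangerous, as circumstantial evidence.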

Of course, a consumer electronics device does not have to be embedded to tell a story (Figure 7). It can also be wearable or luggable, as in the case of a Fitbit that was used as a truth detector in an alleged rape case that turned out to be completely fabricated [23]. Lawyers are now beginning to experiment with other wearable gadgetry that helps to show the impact of personal injury cases from accidents (work and nonwork related) on a person’s ability to return to his or her normal course of activities [24] (Figure 8). We can certainly expect to see a rise in criminal and civil litigation that makes use of a person’s Samsung S Health data, for instance, which measure things like steps taken, stress, heart rate, SpO2, and even location and time (Figure 9). But cases like Compton’s open the floodgates.

Figure 8. A closeup of a patient wearing the iRhythm ZIO XT patch, nine days after its placement. (Photo courtesy of Wikimedia Commons.)

I have pondered the evidence itself: are heart rate data really any different from other biometric data, such as deoxyribonucleic acid (DNA)? Are they perhaps more revealing than DNA? Should they be dealt with in the same way? For example, is the chain of custody for data coming from a pacemaker equal to that of a DNA sample and profile? In some ways, heart rates can be considered a behavioral biometric [25], whereas DNA is actually a cellular sample [26]. No doubt we will be debating the challenges, and extreme perspectives will be hotly contested. But it seems nothing is off limits. If it exists, it can be used for or against you.

Figure 9. (a) and (b) The health-related data from Samsung's S Health application. Unknown to most is that Samsung has diversified its businesses to be a parent company to one of the world's largest health insurers. (Photos courtesy of Katina Michael.)

The Paradox of Uberveillance

In 2006, M.G. Michael coined the term uberveillance to denote “an omnipresent electronic surveillance facilitated by technology that makes it possible to embed surveillance devices in the human body” [27]. No doubt Michael’s background as a former police officer in the early 1980s, together with his cross-disciplinary studies, had something to do with his insights into the creation of the term [28]. This kind of surveillance does not watch from above, rather it penetrates the body and watches from the inside, looking out [29].

Furthermore, uberveillance “takes that which was static or discrete…and makes it constant and embedded” [30]. It is real-time location and condition monitoring and “has to do with the fundamental who (ID), where (location), and when (time) questions in an attempt to derive why (motivation), what (result), and even how (method/plan/thought)” [30]. Uberveillance can be used prospectively or retrospectively. It can be applied as a “predictive mechanism for a person’s expected behavior, traits, likes, or dislikes; or it can be based on historical fact” [30].

In 2008, the term uberveillance was entered into the official Macquarie Dictionary of Australia [31]. In research that has spanned more than two decades on the social implications of implantable devices for medical and nonmedical applications, I predicted [15] that the technological trajectory of implantable devices that were once used solely for care purposes would one day be used retrospectively for tracking and monitoring purposes. Even if the consumer electronics in question were there to provide health care (e.g., the pacemaker example) or convenience (e.g., a near-field-communication-enabled smartphone), the underlying dominant function of the service would be control [32]. The socioethical implications of pervasive and persuasive emerging technologies have yet to really be understood, but increasingly, they will emerge to take center stage in court hearings, like the emergence of DNA evidence and then subsequently global positioning system (GPS) data [33].

Medical device implants provide a very rich source of human activity monitoring, such as the electrocardiogram (EKG), heart rate, and more. Companies like Medtronic, among others specializing in implantables, have proposed a future where even healthy people carry a medical implant packed with sensors that could be life sustaining and detect heart problems (among others), reporting them to a care provider and signaling when assistance might be required [34]. Heart readings provide an individual’s rhythmic biometrics and, at the same time, can record increases and decreases in activity. One could extrapolate that it won’t be long before our health insurance providers are asking for the same evidence in exchange for reduced premiums.

The future might well be one where we all carry a black box implantable recorder of some sort [35], an alibi that proves our innocence or guilt, minute by minute (Figure 10). Of course, an electronic eye constantly recording our every move brings a new connotation to the wise words expressed in the story of Pinocchio: always let your conscience be your guide. The future black boxes may not be as forgiving as Jiminy Cricket and more like Black Mirror’s “The Entire History of You” [36]. But if we assume that these technologies are to be completely trusted, whether they are implantable, wearable, or even luggable, then we are wrong.

The contribution of M.G. Michael’s uberveillance is in the emphasis that the uberveillance equation is a paradox. Yes, there are near-real-time data flowing continuously from more points of view than ever [37], closed-circuit TV looking down, smartphones in our pockets recording location and movement, and even implantables in some of us ensuring nontransferability of identity [38]. The proposition is that all this technology in sum total is bulletproof and foolproof, omniscient and omnipresent, a God’s eye view that cannot be challenged but for the fact that the infrastructure, the devices, and the software are all too human. And while uberveillance is being touted for good through an IoT world that will collectively make us and our planet more sustainable, there is one big crack in the utopian vision: the data can misrepresent, misinform, and be subject to information manipulation [39]. Researchers are already studying the phenomenon of complex visual information manipulation: how to tell whether data have been tampered with, whether a suspect has been introduced into or removed from a crime scene image, and other forensic visual analytics [40]. It is why Vladimir Radunovic, director of cybersecurity and e-diplomacy programs at the DiploFoundation, cited M.G. Michael’s contribution that “big data must be followed by big judgment” [41].

What happens in the future if we go down the path of constant bodily monitoring of vital organs and vital signs, where we are all bearing some device or at least wearing one? Will we be in control of our own data, or, as seems obvious at present, will we not? And how might self-incrimination play a role in our daily lives, or, even worse, individual expectations that can be met only by performing to a theater 24/7 so that our health statistics stack up under whatever measure and cross-examination they are put to, personally or publicly [42]? Can we believe the authenticity of every data stream coming out of a sensor onboard consumer electronics? The answer is no.

Having run many years of GPS data-logging experiments, I can say that a lot can go wrong with sensors, and they are susceptible to outside environmental conditions. For instance, they can log your location miles away (even on another continent), the temperature gauge can play up, time stamps can revert to different time zones, the speed of travel can be wildly inaccurate due to propagation delays in satellites, readings may not come at regular intervals due to some kind of interference, and memory overflow and battery issues, while getting better, are still problematic. The long and short of it is that technology cannot be trusted. At best, it can act as supporting evidence but should never replace eyewitness accounts. Additionally, “the inherent problem with uberveillance is that facts do not always add up to truth (i.e., as in the case of an exclusive disjunction T + T = F), and predictions based on uberveillance are not always correct” [30].
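Several of the failure modes just listed, impossible jumps in position, backward or repeated time stamps, and irregular logging intervals, can be caught with simple sanity checks before a log is treated as evidence. The sketch below is illustrative only; the thresholds are arbitrary assumptions, not forensic standards.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def flag_anomalies(fixes, max_speed_kmh=200, max_gap_s=120):
    """Flag suspect GPS fixes in a list of (epoch_s, lat, lon) triples.

    Assumes the list is in logging order; thresholds are illustrative.
    """
    flags = []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
        dt = t1 - t0
        if dt <= 0:
            flags.append((t1, "timestamp went backward or repeated"))
            continue
        if dt > max_gap_s:
            flags.append((t1, f"irregular logging gap of {dt} s"))
        speed = haversine_km(la0, lo0, la1, lo1) / (dt / 3600)
        if speed > max_speed_kmh:
            flags.append((t1, f"implausible speed of {speed:.0f} km/h"))
    return flags

fixes = [
    (0,   -34.4278, 150.8931),   # Wollongong
    (60,  -34.4280, 150.8935),   # plausible one-minute walk
    (120,  51.5074,  -0.1278),   # jumps to London: "another continent"
]
for t, reason in flag_anomalies(fixes):
    print(t, reason)
```

Checks like these do not restore trust in the sensor; they merely make explicit how often a raw log would fail cross-examination.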

Conclusion

While device manufacturers are challenging in court the claims that their ICDs are hackable [43], highly revered security experts like Bruce Schneier are heavily cautioning against going down the IoT path, no matter how inviting it might look. In his acclaimed blog, Schneier recently wrote [44]:

All computers are hackable…The industry is filled with market failures that, until now, have been largely ignorable. As computers continue to permeate our homes, cars, businesses, these market failures will no longer be tolerable. Our only solution will be regulation, and that regulation will be foisted on us by a government desperate to “do something” in the face of disaster…We also need to reverse the trend to connect everything to the internet. And if we risk harm and even death, we need to think twice about what we connect and what we deliberately leave uncomputerized. If we get this wrong, the computer industry will look like the pharmaceutical industry, or the aircraft industry. But if we get this right, we can maintain the innovative environment of the internet that has given us so much.

By 2020, the cardiac implantables market is predicted to become a US$43 billion industry [45]. Obviously, the stakes are high and getting higher with every breakthrough implantable innovation we develop and bring to market. We will need to address some very pressing questions, as Schneier suggests, through some form of regulation if we are to maintain consumer privacy rights and data security. Joe Carvalko, a former telecommunications engineer and U.S. patent attorney, an associate editor of IEEE Technology and Society Magazine, and a pacemaker recipient, has added much to this discussion already [46], [47]. I highly recommend several of his publications, including “Who Should Own In-the-Body Medical Data in the Age of eHealth?” [48] and an ABA publication coauthored with Cara Morris, The Science and Technology Guidebook for Lawyers [49]. Carvalko is a thought leader in this space, and I encourage you to listen to his podcast [50] and also to read his speculative fiction novel, Death by Internet [51], which is hot off the press and wrestles with some of the issues raised in this article.

[15] K. Michael, “The technological trajectory of the automatic identification industry: The application of the systems of innovation (SI) framework for the characterisation and prediction of the auto-ID industry,” Ph.D. dissertation, School of Information Technology and Computer Science, Univ. of Wollongong, Wollongong, Australia, 2003.

[20] K. Michael, “Big data and policing: The pros and cons of using situational awareness for proactive criminalisation,” presented at the Human Rights and Policing Conf., Australian National University, Canberra, Apr. 16, 2013.

[21] K. Michael and G. L. Rose, “Human tracking technology in mutual legal assistance and police inter-state cooperation in international crimes,” in From Dataveillance to Überveillance and the Realpolitik of the Transparent Society (The Second Workshop on Social Implications of National Security), K. Michael and M. G. Michael, Eds. Wollongong, Australia: University of Wollongong, 2007.

[26] K. Michael, “The European court of human rights ruling against the policy of keeping fingerprints and DNA samples of criminal suspects in Britain, Wales and Northern Ireland: The case of S. and Marper v United Kingdom,” in The Social Implications of Covert Policing (Workshop on the Social Implications of National Security, 2009), S. Bronitt, C. Harfield, and K. Michael, Eds. Wollongong, Australia: University of Wollongong, 2010, pp. 131–155.

[29] M. G. Michael and K. Michael, “A note on uberveillance,” in From Dataveillance to Überveillance and the Realpolitik of the Transparent Society (The Second Workshop on Social Implications of National Security), M. G. Michael and K. Michael, Eds. Wollongong, Australia: University of Wollongong, 2007.

[37] K. Michael, “Sousveillance and point of view technologies in law enforcement,” presented at the Sixth Workshop on the Social Implications of National Security: Sousveillance and Point of View Technologies in Law Enforcement, University of Sydney, Australia, 2012.

Acknowledgment

A short form of this article was presented as a video keynote speech for the Fourth International Conference on Innovations in Information, Embedded and Communication Systems in Coimbatore, India, on 17 March 2017. The video is available at https://www.youtube.com/watch?v=bEKLDhNfZio.

Abstract

Back in 1998, I remember receiving my first second-generation (2G) mobile assignment at telecommunications vendor Nortel: a bid for Hutchison in Australia, a small alternate operator. At that time, I had already grown accustomed to modeling network traffic on traditional voice networks and was beginning to look at the impact of the Internet on data network dimensioning. In Australia, we were still relying on the public switched telephone network to dial up the Internet from homes, the Integrated Services Digital Network in small-to-medium enterprises, and leased lines for larger corporates and the government. But modeling mobile traffic was a different affair.

Figure 1. Russia’s Safe-Selfie campaign flyer.


I remember thinking: how will we begin to categorize subscribers, and what kinds of network patterns could we expect in mobility? I recall beginning to define the market segments into four categories: security (low-end users), road warriors (high-end corporate users), socialites (youth market), and everyday users (average users). Remember, this was even before the rise of the Wireless Application Protocol. Even as naysayers said that the capital expenditure on 2G networks would be prohibitive and that the investments would not be recouped for decades, subscriber usage rapidly increased with devices like the Research in Motion BlackBerry, which allowed for mobile e-mail.

Fast-forward to 2000. I was already knee-deep in third-generation (3G) mobile bids, predicting the cost of 3G spectrum in emerging and developed markets, increasing my categories of subscriber types from four to nine segments, and calculating upload and download rates for top mobile apps like gaming, images (photos and imaging), and e-mail with chunky PowerPoint attachments and other file types. We knew what was coming was big, but perhaps we ourselves, sitting at the coalface, didn't realize what an impact it would actually have on our lives and the lives of our children. Our models showed average revenues per user of US$120 per month for corporates. At the time, most of us believed an explosion would take place in the coming decade (though not as big as it turned out to be), despite preaching the mantra that voice is now just another bit of data. In calculating pricing models, we brainstormed with one another: who would spend over 100 min on a mobile? Who would spend hours gaming on a handset rather than on a larger gaming console?

Enter social media, enabled by this wireless Internet Protocol (IP) revolution and the rapid increase in diverse mobile hardware, from netbooks to tablets to smartphones and smartwatches. Then things rapidly changed again. LinkedIn, Facebook, Twitter, Instagram, Snapchat, and WeChat are all enjoyed by social media users (consumers and professionals) around the globe today, and it is estimated that there will be 2.67 billion social network users by 2018 [1]. More than one-third of consumers worldwide, over 2.56 billion people, will have a mobile phone by 2018, and more than half of these will have smartphone capability, making feature phones the minority [2].

The Social Media Boom

When Google announced that a staggering 24 billion selfies were uploaded to its servers alone in 2015, consuming 13.7 petabytes of storage space, I stopped and contemplated the meaning of these statistics [3]. What about the zillions of selfies uploaded to Apple's iCloud or posted to Facebook, Instagram, Snapchat, and Twitter? It means that, averaged over a lifetime, most people are taking at least one selfie a day and sharing their image publicly. This figure is much higher for the impressionable teen market, with a 2015 Google study reporting that youth take, on average, 14 selfies and 16 photos or videos, check social media 21 times, and send 25 text messages per day [4]. This number continues to grow steadily, according to fresh evidence from Pew Internet Research [5], and is now even impacting workplace productivity [6]. In the same year that Google announced the selfie statistics, Russia's Ministry of Internal Affairs began a Safe-Selfie campaign [7], stating: "When you take a selfie, make sure that you are in a safe place and your life is not in danger!" (Figure 1). This followed, of course, the acknowledged deaths that had occurred while younger and older individuals were in the process of taking selfies; selfie-related deaths outnumbered shark-attack deaths in 2015 [8]. One can hardly fathom it.

Noticeable is the adoption of high-tech gadgetry, especially in the childhood to youth markets, with an even greater penetration among teenagers and individuals younger than 34 years. It is rather disturbing to read that 24% of U.S. teens go online "almost constantly" [5], facilitated by the widespread penetration of smartphones and the increasing requirement for tablets in the secondary education system. The sheer affordability of tech gear and its increasing multifunctionality now means that most people have a digital Swiss Army knife at their disposal in the smartphone. By accessing the Internet via your phone, you can upload pictures, browse websites, navigate locations on maps, and be reachable any time of the day. The allure of killing time while waiting for appointments or on public transportation means that most people are frequently engaged in some form of interaction through a screen. The short-lived Google Glass was a hands-free solution that would have brought the screen right up to the eye [9], and while its development was momentarily halted, one can envisage a future where we see everything through filtered lenses. The Google Glass Enterprise Edition is now on sale [35]!

The Rise of Internet Addiction

Experts have tried to quantify the amount of time being spent on screens, specific devices (smartphones), and even particular apps (e.g., Facebook), and have identified guidelines for appropriate use by various age groups. Most notable is the work started by Dr. Kimberly Young in 1995, when she established her website netaddiction.com and clinical practice, the Center for Internet Addiction. She has since been conducting research on how the Internet changes people's behavior. Her "3-6-9-12 Screen Smart Parenting" guideline has gained worldwide recognition [10].

Increasingly, we are hearing social media addiction stories (see "Social Media Addiction" [11] and "Mental Health and Social Media" [36]). We have all heard about the toddler screaming for his or her iPad before breakfast and gamers who are reluctant to come to dinner with the rest of the family (independent of gender, age, or ethnicity) unless they are instant messaged. There is a growing complexity around the diagnosis of various addiction behaviors. Some suffer from Internet addiction broadly, while others are addicted to computer gaming, smartphones, or even social media. It has been postulated by some researchers that most of these modern technology-centric addictions have age-old causes, such as obsessive-compulsive disorder, but they have definitely been responsible for triggering a new breed of what I consider to be yet-to-be-defined medical health issues.

In the last five years especially, much research has begun in the area of online addiction. Various scales for Internet addiction have been developed by psychologists, and there are now even scales for specific technologies, like smartphones. The South Koreans have developed the Smartphone Addiction Scale, the Smartphone Addiction Proneness Scale, and the KS-scale, a method for Koreans to self-report Internet addiction using a short-form scale. Unsurprisingly, these scales are significant for the South Korean market, given that it is the world leader in Internet connectivity, with the world's fastest average Internet connection speed and roughly 93% of citizens connected. It therefore follows that the greater the penetration of high-speed Internet in a market, the greater the propensity for a subscriber to suffer from some form of online addiction. There are even scales for social media applications, e.g., the Bergen Facebook Addiction Scale (BFAS), developed by Dr. Cecilie Andreassen at the University of Bergen in Norway in 2012 (see "BFAS Survey Statements" [12]).
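As a concrete illustration of how such short-form scales are typically scored, here is a minimal Python sketch of a six-item instrument in the style of the BFAS. The item responses are hypothetical, and the cutoff shown (scoring "sometimes" or more often, i.e., 3 or above, on at least four of the six items) is one commonly cited polythetic rule; consult the published scale before any real use.

```python
# Minimal sketch of scoring a six-item addiction scale in the style of the BFAS.
# Responses and the cutoff rule here are simplified for illustration only.

def bfas_score(responses):
    """responses: six integers, each 1 (very rarely) to 5 (very often)."""
    if len(responses) != 6 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected six responses on a 1-5 scale")
    total = sum(responses)
    # One commonly cited polythetic cutoff: scoring 3 or above
    # ("sometimes" or more often) on at least four of the six items.
    flagged = sum(r >= 3 for r in responses) >= 4
    return total, flagged

total, flagged = bfas_score([4, 3, 2, 5, 3, 1])
print(total, flagged)  # 18 True
```

The total score supports continuous analyses, while the polythetic flag mimics the kind of categorical "at risk" classification these instruments report.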

Accessible Internet Feeds the Addiction

Despite its remoteness from the rest of the world, Australia surprisingly does not lag far behind the South Korean market. According to the Australian Bureau of Statistics, in 2013, 94% of Australians were Internet users, although regional areas across Australia do not enjoy the same high-speed access as South Korea, despite the National Broadband Network initiative that was founded in 2009, with the actual rollout beginning in 2015. Yet, alarmingly, one recognized industry report, "Digital Down Under," stated that 13.4 million Australians spent a whopping 18.8 h a day online [13]. This statistic has been contested but was nonetheless defended by Lee Hawksley, managing director of ExactTarget Australia, who oversaw the research. She has gone on record saying, "...49% of Australians have smartphones, which means we are online all the time…from waking to sleep, when it comes to e-mail, immersion, it's even from the 18–65s; however, obviously with various social media channels the 18–35s are leading the charge."

According to the same study, roughly one-third of women living in New South Wales spend almost two-thirds of their day online. It is also women who are 30% more likely than men to suffer anxiety as a result of participating in social media [14]–[16]. This is even greater than the Albrecht and Michael deduction of 2014, which estimated that people in developed nations spend an average of 69% of their waking life behind the screen [17]. That is about 11 h behind screens out of 16 waking hours. But, no doubt, people are no longer sleeping 8 h with technology at arm's reach in the bedroom, and, as a result of screen dependencies, cracks are appearing in relationships and employment, and severe sleep deprivation and other problems are emerging [18].

It is difficult to say what kinds of specific addictions exist in relation to the digital world, and various countries identify market-relevant scales and measures. While countries like China, Taiwan, and South Korea acknowledge a diagnosed medical condition called "Internet addiction," other countries, e.g., the United States, prefer not to be explicit about the condition in the Diagnostic and Statistical Manual of Mental Disorders, 5th edition (DSM-5) [19]. Instead, a potential new diagnosis, Internet gaming disorder, is dealt with in an appendix of the DSM-5 [20], [21]. Generally, Internet addiction is defined as "the inability of individuals to control their Internet use, resulting in marked distress and/or functional impairment in daily life" [22]. Some practitioners have likened online addiction to substance-based addiction. It usually manifests predominantly in one of three quite separate but sometimes overlapping subtypes: excessive gaming, sexual preoccupations [23], and e-mail/text/social media messaging [24].

Shared Data and the Need to Know

For now, what has been quantified and is well known is the amount of screen time individuals spend in front of multiple platforms: Internet-enabled television (e.g., Netflix), game consoles (for video games), desktops (for browsing), tablets (for pictures and editing), and smartphones (for social media messaging). Rest assured, the IP-enabled devices we are enjoying are passing on our details to corporations, which, in the name of billing, now accurately know our family's every move, digitally chronicling our preferences and habits. It is a form of pervasive social and behavioral biometrics, allowing big business to know, app by app, your individual thoughts. What is happening to all this metadata? Of course, it is being repurposed to give you more of the same, generating even more profit for interested businesses. For some capitalists, there is nothing wrong with this calculated engineering. Giving you more of what you want is the new mantra, but it obviously has its side effects.

The Australian Psychological Society issued its "Stress and Wellbeing in Australia" report last year, which included a section on social media fear of missing out (FOMO) [25]. Alongside FOMO, we also now have the fear of being off the grid (FOBO) and the fear of no mobile (NoMo) [26]. I personally know adults who will not leave their homes in the morning unless they have watched the top ten YouTube videos of the day, or who won't go to sleep until every last e-mail has been answered, actioned, and filed in the appropriate folder. Screen times are forever increasing, and this has come at the expense of physical exercise and one-to-one time with loved ones.

There are reports of men addicted to video games who cannot keep a nine-to-five job, women suffering from depression and anxiety because they compare their online status with that of their peers, children who message on Instagram throughout the night, and people who are addicted to their work at the expense of all the physical relationships around them. Perhaps most disturbing are the increasing cases of exposure to online porn among children, particularly between the ages of 9 and 13 [27], cybersexual activities in adolescence, and extreme social media communities that spread disinformation. The assumption is that if it's conducted virtually, it must not be real, with no physical repercussions; but far from it: online addictions generate a guilt that lingers and is hard to shed. This is particularly true of misdemeanors published to the websphere that can be played back, preventing individuals from forgetting their prior actions or breaking out of stereotypes [28].

A New Tool: The AntiSocial App

Figure 2. The app AntiSocial measures the number of unlocks. (Image courtesy of BugBean.)

Endless pings plague smartphone users whose settings have been left at the defaults [29]. Notifications and alerts are checked while users are driving (even where it is against the law to text and drive), in the middle of a conversation, in bed while being intimate, while using the restroom, or even while taking a shower. But no one has ever measured end-to-end use through actual digital instrumentation in an open market setting. It has been left to self-reporting mechanisms, desktop applications that monitor how long workers use various work applications or e-mail, or closed surveys of populations participating in trials. Yet manual voluntary audit logs are often incomplete or underreport actual usage, and closed trials are often not representative of reality. At best, we can point to the South Korean smartphone verification and management system, which has helped to raise awareness that such a system is needed for intervention [30]. And the concern is so high that we can say with some confidence that it won't take long for companies to come out with socially responsible technologies and software to help us remain in the driver's seat.

Figure 3. AntiSocial measures app usage in minutes, allowing the user to limit or block certain apps based on a predefined consumption. (Image courtesy of BugBean.)

Enter the new app AntiSocial, created by Melbourne, Australia, software company BugBean, which has consumer interests at heart [31]. Antisocial.io has taken the world by storm, downloaded from Google Play by individuals in over 150 countries within just a few months. The fact that it ranked number three among U.K. Google Play downloads after only a few days demonstrates the need for it. It not only accurately records usage across multiple application contexts but also encourages mindfulness about usage. AntiSocial does not tell users to stop using social media or to stop video gaming for entertainment, but it reminds people to consider their digital calorie intake by comparing their behaviors with those of other anonymous users in their age group, occupation, and location. It is not about shaming users but about raising individual awareness and wasting less time. We say we are too busy for this or that, and yet we don't realize we are getting lost and absorbed in online activities. How do we reclaim some of this time [32]?

It may well be as simple as switching off the phone in particular settings, deliberately not taking it with you on a given outing, or having a digital detox day once a week or once a month. It might be taking responsibility for the length of screen time you have when you are away from the office or using AntiSocial to block certain apps after a self-determined amount of time has been spent on the app on any given day [33]. Whatever your personal solution, taking the AntiSocial challenge is about empowering you, and letting you exploit the technology at your fingertips without it exploiting you.
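The self-determined limit mechanism described above can be sketched in a few lines of Python. This is not BugBean's actual implementation, which is not public; the app names and minute caps below are illustrative assumptions.

```python
# Hypothetical sketch of a per-app daily usage limiter of the kind AntiSocial
# describes: accumulate foreground minutes per app and flag any app that has
# reached a user-chosen daily cap. App names and limits are illustrative.

from collections import defaultdict

class UsageLimiter:
    def __init__(self, daily_limits_min):
        self.limits = daily_limits_min      # e.g. {"facebook": 30}
        self.used = defaultdict(float)      # minutes used so far today

    def record(self, app, minutes):
        """Add observed foreground time for an app."""
        self.used[app] += minutes

    def blocked(self, app):
        """True once the app has consumed its self-determined daily cap."""
        limit = self.limits.get(app)
        return limit is not None and self.used[app] >= limit

limiter = UsageLimiter({"facebook": 30, "game": 60})
limiter.record("facebook", 25)
print(limiter.blocked("facebook"))  # False: 25 of 30 min used
limiter.record("facebook", 10)
print(limiter.blocked("facebook"))  # True: cap reached
```

Apps without a configured limit are never blocked, matching the opt-in, self-determined spirit of the feature.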

The AntiSocial App will Help

Figure 4. AntiSocial benchmarks smartphone app usage against others in the same age group, occupation, and location. (Image courtesy of BugBean.)

Some of the social problems that arise from smartphone and/or social media addiction include sleep deprivation, anxiety, depression, a drop in grades, and anger-management issues. AntiSocial provides a count of the number of unlocks you perform on your handset (Figure 2) and tells you in minutes how long you use each application (Figure 3), including the camera, Facebook and Instagram, and your favorite gaming app. It will help you to compare yourself against others and take responsibility for your use (Figure 4). You might choose to replace that time spent on Facebook with, e.g., time walking the dog, helping your kids with their homework, or even learning to cook a new recipe [34]. There is also a paired version that can be shared between parents and their children or even colleagues and friends. You might like to set yourself a challenge to detox digitally, just as you might at your local gym in terms of fitness and weight loss. Have fun within a two-week timeframe, declaring yourself the biggest loser (of mobile minutes, that is), and report back to family and friends on what you feel you have gained. You might be surprised how liberating this actually feels.

You’ll come away appreciating the digital world and its conveniences a great deal more. You’ll also likely have a clearer head and not be tempted to snap back a reply online that might hurt another or inadvertently hurt yourself. And you’ll be able to use the AntiSocial app to become more social and start that invaluable conversation with those loved ones around you in the physical space.

You Want to Do What with RFID?: Perceptions of radio-frequency identification implants for employee identification in the workplace.

Electronic employee identification (ID) has transformed the workplace. Handheld tokens, such as contactless smartcards and wearable clip-on infrared badges, are now fundamental to security practices across the globe. Medium-to-large organizations continually stress the importance of employees carrying their staff cards at all times and displaying them for security purposes. Staff badges have been increasingly linked to physical access control in buildings, dynamic computer log-in, and even e-payment using stored value. Most employees carry their electronic ID in a plastic sleeve attached to a lanyard, but because the card is removable, it can be left behind, misplaced, or stolen.

Biometric systems have been used to log employee hours for payroll and to register time and attendance since the late 1980s. However, given the cost of biometric readers, dispersing them around a large, closed campus for access control purposes is considered prohibitive. Radio-frequency identification (RFID) implants have been touted by proponents as more secure, nontransferable, and an overall cheaper solution with the potential for multifunctionality and multiapplication growth. They can also be programmed dynamically. In 2004, in the aftermath of 9/11, VeriChip received U.S. Food and Drug Administration approval for an RFID chip that it believed could be used to ensure employee safety, among other uses. By 2005, the Baja Beach Club chain had introduced the VeriChip to several of its clubs, and in 2006, the small business Citywatcher.com likewise offered implants to its employees.

Literature Review

Electronic RFID implants are capable of omnipresent electronic surveillance. RFID tags or transponders can be implanted into the human body to track the who, what, where, when, and how of human life [1]. This act of embedding devices into human beings for surveillance purposes is known as uberveillance (see www.uberveillance.com) [2]. While the tiny embedded RFID chips do not have global positioning capabilities, an RFID reader (fixed or mobile) can capture time stamps, exit and entry sequences to denote when someone is coming or going, and in which direction he or she is traveling and then make inferences on time, location, distance, and speed. For the greater part, RFID microchips in the wearable form have been used to track prison inmates, hospital patients, or visitors in various market niches.
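The kind of inference described above can be sketched directly: given timestamped reads of one tag at two fixed readers a known distance apart, direction and average speed fall out of simple arithmetic. The reader names and the 10-m spacing below are illustrative assumptions, not any particular deployment.

```python
# Sketch of movement inference from fixed-reader RFID logs: two readers at a
# known distance log (reader_id, timestamp) events for a tag, from which
# direction of travel and average speed can be derived. The reader names and
# the 10 m spacing are illustrative assumptions.

from datetime import datetime

READER_DISTANCE_M = 10.0  # assumed spacing between the two readers

def infer_movement(reads):
    """reads: list of (reader_id, datetime) for one tag, in arrival order."""
    (first, t0), (second, t1) = reads[0], reads[-1]
    elapsed = (t1 - t0).total_seconds()
    speed = READER_DISTANCE_M / elapsed if elapsed > 0 else float("inf")
    return {"from": first, "to": second,
            "seconds": elapsed, "speed_m_s": round(speed, 2)}

reads = [("door",  datetime(2017, 6, 1, 9, 0, 0)),
         ("lobby", datetime(2017, 6, 1, 9, 0, 8))]
print(infer_movement(reads))
# {'from': 'door', 'to': 'lobby', 'seconds': 8.0, 'speed_m_s': 1.25}
```

Even without GPS, chaining such reader-pair inferences over a day reconstructs a surprisingly detailed movement profile, which is precisely the surveillance concern the text raises.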

However, their use is now being considered in a variety of human-centric military applications for soldiers (e.g., dog tags) and for the tracking of suspected terrorists and convicted pedophiles [3]–[5]. Alzheimer's and dementia patients are also being considered as potential recipients of RFID microchips to aid in identification and as wander alerts [6], [7]. But this is the first time that we are witnessing the deployment of RFID implants for the sole purpose of convenience, for example, for ease of access to a premises, as opposed to countering the act of unauthorized access through security measures at the location.

Recent Momentum of Implantable Devices

Between 2014 and 2016, international media covered numerous Internet of Things stories that make this article timely. In April 2014, GroupM's Irwin Gotlieb said that the "Wearable is cool, but the next form of media will be implantable—devices which are implanted in the human body" [8]. Ray Kurzweil, director of engineering at Google, concurred that we would have "millions of blood-cell-sized computers in our bloodstream" within ten to 20 years [9]. In June 2014, IEEE Spectrum reported that Medtronic wanted to implant sensors in everyone [10].

In November 2014, Peter Diamandis, well-known chief executive officer of the X PRIZE Foundation and cofounder of Singularity University, got a near-field communication (NFC) implant on the spur of the moment at the Singularity Summit in Amsterdam. He said in his own blog: “Many big companies like Apple, Samsung, and Google are working on technology to measure your biology from outside of your body. Wearable devices ranging from watches to contact lenses will track everything…footsteps, heart rate, blood glucose, blood pressure, and other critical vitals. The challenge is that they only work when you remember to wear them, and there are some things you can’t measure from the outside. The question is: when would you be ready to start incorporating technology into your body?” [11].

To demonstrate that this thinking about next-generation information technology is not isolated to the United States, in December 2014, eight Swedes held an implant party in Stockholm. BBC News reporter Jane Wakefield noted that Hannes Sjoblad, chief disruption officer at Epicenter and founder of BioNyfiken, hoped that his implant party would spark a conversation about our possible cyborg future. He said: "The idea is to become a community that is why they get implants done together… People bond over the experience and start asking questions about what it means to be a man and machine… Curiosity is one of the biggest drivers for us humans. I come from a maker hacker culture and I just want to see what I can do with this" [12]. In January 2015, the BBC reported that a high-tech office block in Sweden known as Epicenter was granting employees the option of a microchip implant under the skin for physical access control to the building, among other functions [13]. As of April 2017, 150 of the 2,000 employees in the Epicenter complex are microchipped, and most of these chippings take place at "implant parties" [14]. In August 2015, Lloyds Bank reported that about 7% of U.K. consumers would adopt microchip implants in their bodies for making electronic payments [15]. In September of the same year, Kaspersky Labs became intrigued with the security issues related to microchip implants and engaged Sjoblad to participate in its Asia Pacific Cyber Security Summit in Malaysia to demonstrate the implantation process [16].

In January 2016, Andreas Sjöström used an NFC chip implanted beneath his skin as a boarding pass on a Lufthansa flight from Stockholm's Arlanda Airport [17]. He claimed it was purely for experimental purposes. In September 2016, Shanti Korporaal launched ChipMyLife, an Australian distribution service, with her husband, Skeeves Stevens [18]. She uses her implant to enter the physical premises of her building [19].

This is all while DangerousThings.com has been creating a recognized brand with NFC/RFID implant solutions for biohackers since 2013. Visiting the home page of Dangerous Things, one is greeted by the following messages: “We believe biohacking is the forefront of a new kind of evolution” and “RFID/NFC next-level body augmentation.” On the “About Us” page, it is noted, “We believe our bodies are our own, to do with what we want. The ‘socially acceptable’ of tomorrow will be defined by boundaries pushed today, and we’re excited to be a part of it” [20].

Amal Graafstra then launched the myUKI concept, a multiapplication RFID chip, which has now been rebranded as VivoKey, with integrated smartphone features [21], [22]. He writes: "You can be you (and nobody else can), anywhere, all the time. You + VivoKey means your biological and digital identities can be cryptographically merged, ensuring the one true you is the only you using your devices, sending your messages, reading your e-mail, accessing your accounts, opening your doors, driving your vehicles, and spending your money" [23].

Methodology

The transnational quantitative survey was conducted 4–18 April 2011; each online survey took an average of 10 min to complete and included one optional open-ended question. Participants were small-business owners (N = 453) from four countries: Australia (n = 114), India (n = 111), the United Kingdom (n = 111), and the United States (n = 117). They were asked, "How would you personally feel about being implanted for ease of identification with your own organization?" With respect to gender, 51.9% of participants were male and 48.1% were female. Participants ranged from 18 to 71 years of age, with a mean age of 44 and a median age of 45. Eighty percent of the organizations surveyed had fewer than five employees. The chi-square analysis of the quantitative data was presented at the 2014 IEEE Conference on Norbert Wiener in the 21st Century; no significant chi-square results were found with respect to country of residence and religious, social, and cultural issues [24].

The study employed one instrument that collected key data on the business profile, the technologies currently utilized for identification and access control at the organization, and the senior executives' perceptions of RFID implants in humans for identification and access control in organizations. Twenty-five percent of the small-business owners who participated in the survey said they had electronic ID access to their premises. Twenty percent of the small-business owners' employee ID cards came equipped with a photograph, and less than 5% stated they had had a security breach in the 12 months preceding the study. Of the total number of respondents, 41% provided a comment to the open-ended question. These comments are collated and coded thematically below. A concept map was also generated using the content analysis tool Leximancer (Figure 1).
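For readers unfamiliar with the technique, a chi-square test of independence on survey counts can be sketched as follows. The contingency table uses hypothetical accept/reject counts for two countries, not the study's actual data, and the statistic is compared with 3.841, the critical value at the 0.05 level for one degree of freedom.

```python
# Illustrative chi-square test of independence on a contingency table of
# survey responses. The counts below are hypothetical, not the study's data.

def chi_square(table):
    """table: 2D list of observed counts; returns the chi-square statistic."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts: accept vs. reject responses in two countries.
observed = [[30, 84],   # country A: accept, reject
            [45, 72]]   # country B: accept, reject
stat = chi_square(observed)
print(round(stat, 3), stat > 3.841)  # 3.841 = 0.05 critical value, df = 1
```

A statistic below the critical value, as the study reports for country of residence against religious, social, and cultural issues, means the null hypothesis of independence cannot be rejected.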

Results: Reasons for Rejection

The following are the responses expressed for rejecting the concept.

Negative Feelings

1) Disagree 2) Object 3) Hate it 4) Would not like it 5) Refuse 6) Not agree 7) Against 8) Yuk 9) Abhorrent 10) Absolutely vehemently opposed 11) Not happy 12) Not positive 13) No way in hell 14) What an unimaginably appalling idea 15) Horrified 16) I would leave the job 17) Totally against 18) I don’t think it would ever be appropriate to implant devices into the body for such a trivial thing 19) Never 20) Utterly unacceptable 21) I find the thought appalling 22) Absolutely not 23) Object strongly 24) I would detest it 25) Don’t like the idea 26) I would sooner stick pins in my eyeballs 27) I would definitely not allow anything to be implanted in my body for any reason, let alone for work purposes. I would not do it, if I was paid or even if I was to lose my job if I refused. 28) Absolutely not! It’s insane! 29) Negative feelings 30) Way too extreme 31) I would refuse. This should be illegal. 32) Would leave my job 33) Honestly, though I wholeheartedly embrace new technologies and the benefits that can be gained from such, this would make me uncomfortable…it’s just creepy. 34) I would not like it and not approve of it at all 35) I think it’s an awful idea 36) Not a chance 37) I wouldn’t be. I don’t believe in it. 38) I think it’s absolutely ludicrous and I would never do it 39) I would not agree to it nor would I work for someone that did 40) Would not do it under any circumstance 41) I would never support this. Ever. 42) No, no, no 43) Only if I’m drugged and kidnapped—then wake up in a motel bathtub full of ice with my kidney removed and an RFID chip implanted in the back of my hand. Then I would be okay about it. 44) Unwarranted 45) It’s difficult for me even I myself will disagree 46) I would not do it for my company 47) I would be apprehensive about it 48) Highly doubtful 49) Ridiculous.

Figure 1. The concept map of relationships of terms in responses.

Inhumanity

1) Sick 2) Our shop floor employees and executives are not pets 3) This sounds hideous, and inhuman 4) This is a disgraceful suggestion. The company does not own the employees. Slavery was abolished in developed countries more than 100 years ago. How dare you even suggest such a thing? You should be ashamed. 5) I absolutely would not allow an employer to implant me like a ... dog! 6) Animals are microchipped not humans 7) Feel like an experimental animal 8) Seems very robotic 9) No way! We’re not pets. 10) Absolutely not! It’s insane!

Invasiveness

1) We have no plans to introduce this invasive procedure 2) I would never work for an organization that would require an RFID, and I sure would never expect or request my current or future employees to be violated like that 3) Invasive both physically and psychologically 4) Invading my body's privacy 5) Violated 6) I would be reticent, given that there are less intrusive options 7) Against it—too invasive 8) I am totally against this procedure because it is a violation of one's body and privacy 9) It would be invasive.

Invasion of Privacy

1) Personal privacy concern 2) Feels a bit like Big Brother 3) Could be used by outside world, say police 4) I wouldn’t like it—privacy issues 5) Would feel that Big Brother is watching over me 6) I feel it would be an invasion of privacy 7) It would never happen. I think it’s a Big Brother theory and morally and culturally wrong. 8) I would not do it. I do not like that idea, too Big Brother 9) It seems too Big Brother to me 10) It will never happen—I am against this—Big Brother is here 11) I am totally against this procedure, because it is a violation of one’s body and privacy 12) Not at all want or like. Too Big Brother. 13) Do not want to be implanted with anything that gives away my identity so easily. I would want to be in control of when how where to disclose my identity. 14) Feels a bit like Big Brother 15) It would never happen. I think it’s a Big Brother theory and morally and culturally wrong.

Invasion of Human Rights

1) A total violation of human rights 2) Violation of human rights 3) I would refuse, as I would see it as an invasion of my civil liberty 4) Should not be implanted as it breaches individual rights.

Religious Issues

1) I do not believe you should put things in your body that God did not supply you with 2) Don't need it so wouldn't do it. Also religious reasons. 3) I would not want it, I do believe in the mark of the beast in my religion, and that is getting a little too close to home for me 4) I would hate it! It goes against all that I believe in.

Cultural Issues

1) Too sophisticated for our community 2) I would hate it! It goes against all that I believe in. 3) There will be division between those who have implants for ID and those who have the current technology for ID. 4) It would never happen. I think it's a Big Brother theory and morally and culturally wrong.

Health Issues

1) Maybe there is an issue with occupational health and safety 2) Health implications: a) do we know everything about side effects? b) maintenance: isn't repairing/replacing/upgrading a lot more impractical? 3) I would be against it. There are too many health issues that are still unaddressed when it comes to implanting a foreign object in one's body. 4) Would not allow it. Health risks due to chronic disease. I have severe rheumatoid arthritis. 5) I would worry about possible health issues 6) It could cause health issues—it is not for everyone 7) I do not feel anything for this implant of RFID because it can harm the human body.

Cosmetic Issues

1) Wouldn’t want another scar on body 2) Personally, I won’t entertain because of the marks left on the skin.

Inconvenience

1) I would feel extremely uncomfortable with this 2) Feel that it is an imposition 3) I do not think I would be too pleased about it 4) Be bothered and uncomfortable 5) Very much complicated.

Unwarranted

1) Strongly against: there is no need for it 2) I am not against this type of device, but I would not use it simply for business security 3) Unnecessary 4) Small enterprises/offices do not need it 5) Not needed 6) Not relevant 7) Company is too small 8) I don’t see the need for it within our small organization 9) As we are only a small firm, we are familiar with all our employees 10) Not worthwhile for such a small business but would refuse 11) Not relevant for my small organization 12) There’s no need. We are too small 13) It is not needed when I am an investor only with equities 14) As the employees in my firm are part-time based, the firm is not interested in implanting the chips 15) It’s good, but the need for such high security measures is something unnecessary—again this creates more problems from the staff.

Cost Issues

1) Presently I am not ready for the same, as it will be quite expensive for organization of my size 2) Not interested as this involves cost.

Reasons for Acceptance

The following are the responses expressed for accepting the concept.

Positive Feelings—Smart Idea

1) It is a smart idea 2) Very necessary 3) I think it would be a good idea 4) Never opted for that idea but surely would like to try it 5) It will be good 6) God’s gift 7) It would be good 8) Very good 9) Good innovation 10) Good 11) I think don’t have a problem in implantation 12) Very much 13) I am open to the idea of getting an implant 14) It is good technology 15) It’s good 16) It is good 17) It’s ok with me 18) So much 19) It’s new concept for me as well, but I like the concept 20) Welcome.

Neutral Feelings—Indifferent

1) Wouldn’t bother me 2) Would have no issue with the technology 3) Indifferent 4) Fine 5) I feel okay 6) Nothing special 7) No issues 8) I do not think any harm in this 9) Neutral 10) No hassle 11) Don’t care 12) No issue 13) Normal.

Convenience Value

1) Simple use and low cost 2) Very easy to identify employers 3) I feel it is better option to prescribe the identification method 4) I would feel absolutely in business 24×7 5) It would make it much more convenient to never lose or forget one’s ID.

Enhanced Security

1) Feel secure 2) As an owner or being in long-term association with the company will make me feel secure about my work and position 3) It is very secure and is very useful in our organization 4) It is safe and secure 5) This creates security with regard to business.

The Cool Factor—Innovativeness

1) Cool 2) Proud 3) As I am the owner/sole proprietor of the organization, I am very proud to be associated in such business 4) It is good to start a new technology.

Enhanced Privacy

1) Great, I don’t have to carry a tag showing my identity to all.

Conditional Acceptance

The following are the responses expressed for conditionally accepting the concept.

Job Retention

1) Would submit if it meant losing my job otherwise.

Power or Influence

1) If it is necessary, I would do it 2) If top official finds something required to upgrade the identification and security in the organization, they might implement the changes.

Business Need

1) If we felt we needed it, then no problem.

Maybe: Undecided

The following are the responses expressed by those who were undecided about the concept.

Awareness

1) I wouldn’t mind, but I would want more information like is it safe? How is it administered? How is it removed? Where would it be put? Will it leave a scar? Are there any other complications (my personal security, tracking, global positioning system, and so on)? 2) Uncomfortable about it until I understood it completely and had seen it demonstrated and tested for any side effects and so on 3) I would prefer to implant but would like know in depth about the viability 4) No clue 5) No idea 6) Don’t know 7) Undecided.

Overall Value—Benefits versus Sacrifices

1) It is very useful, but at the same time, it is also risky.

Discussion

Thematic Analysis

The results presented previously have been codified into common themes. By far, there were more perceived reasons to reject implants as electronic employee ID than to accept them. What we saw from the codification exercise was that those participants who responded to the open-ended question had a clear-cut perceived reason for accepting or rejecting being implanted for employee ID. That reason may have been expressed as a feeling rather than an identifiable reason, but nonetheless, the respondents were absolute in their response. Only a small number of respondents, 12 in total, were truly undecided in the maybe category or would conditionally accept the concept. This indicates, more than anything else, that microchipping people in the workforce is a divisive issue: Either an individual is vehemently opposed to the idea of being microchipped for employee ID or he or she is willing to accept the implant with some certainty. Of the 186 responses received from the total number of respondents (N = 453), only 48 people perceived they would accept an implant for employee ID purposes in the workplace. Thirteen of those 48 people were indifferent to the type of technology instituted for employee ID, stating they would not be bothered by, were neutral about, or saw no hassle in the use of an implantable device.

Content Analysis

Table 1 presents the words that came up frequently in the content analysis of the open-ended question, “How would you personally feel about being implanted for ease of identification with your own organization?” The terms most frequently recorded by the transnational participants in describing their response to implants in the workplace included: feel, issues, body, absolutely, happen, Big Brother, work, security, idea, implant, employees, need, technology, uncomfortable, believe, and leave. This demonstrates that, holistically, respondents were aware of the potential to implant people for identification purposes but felt there were many issues, that is, barriers to acceptance, namely those issues related to surveillance (i.e., Big Brother).
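
As a rough illustration of the kind of word-frequency count that underlies such a content analysis (the article does not describe its analysis tooling, so the sample responses and stop-word list below are invented for this sketch):

```python
import re
from collections import Counter

# Hypothetical sample of open-ended responses (not the study's actual data).
responses = [
    "I feel it would be an invasion of privacy",
    "Feels a bit like Big Brother",
    "I would feel extremely uncomfortable with this",
    "There are too many health issues with the implant",
]

# Illustrative stop-word list; real analyses use larger curated lists.
STOP_WORDS = {"i", "it", "of", "a", "an", "the", "with", "this",
              "be", "are", "too", "would", "like"}

def term_frequencies(texts):
    """Count content words across all responses, ignoring stop words."""
    words = []
    for text in texts:
        words.extend(w for w in re.findall(r"[a-z]+", text.lower())
                     if w not in STOP_WORDS)
    return Counter(words)

freqs = term_frequencies(responses)
print(freqs.most_common(5))
```

On this invented sample, "feel" surfaces as the most frequent content term, mirroring how terms such as feel, issues, and Big Brother rose to the top of the study's frequency table.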

The question itself caused some people to feel uncomfortable and to consider leaving if mandatory implantation was enforced. It is important to state, however, that most proponents of implantables (e.g., Retherford of Citywatcher.com) do not believe that implantables in the workplace will ever be imposed but will be presented as one of several options and that, over time, the convenience will overcome any doubts or fears end users might be harboring personally [25].

Concept Map

The relationship between terms found frequently in the participant responses is presented in the concept map found in Figure 1. The most frequently used terms are represented by larger nodes, for example, body, absolutely, feel, and issues, and their proximate colocation with other terms indicates a contextual power relationship. Reading this concept map, we can deduce that, even if RFID implants were introduced for security purposes in the workplace, employee sentiment would generally be one of feeling uncomfortable. And whether this was due to Big Brother or other beliefs (e.g., religious, cultural, or societal), the absolute response would be to go so far as leaving that job and workplace.
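
The colocation logic behind a concept map of this kind can be sketched as a simple pairwise co-occurrence count over responses (the terms and responses here are invented; dedicated concept-mapping tools apply far more sophisticated weighting):

```python
from collections import Counter
from itertools import combinations

# Hypothetical pre-tokenized responses (not the study's data).
responses = [
    "feel uncomfortable big brother",
    "big brother privacy issues",
    "feel issues with body",
]

def cooccurrence(texts):
    """Count how often each pair of terms appears within the same response."""
    pairs = Counter()
    for text in texts:
        terms = sorted(set(text.split()))  # unique terms, alphabetical order
        pairs.update(combinations(terms, 2))
    return pairs

links = cooccurrence(responses)
# Pairs with higher counts would be drawn as closer, more strongly
# linked nodes in a concept map.
```

Here the pair ("big", "brother") co-occurs in two responses, so those nodes would sit closest together on the resulting map.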

Conclusion

This article presented and analyzed the results of a transnational study that surveyed small business owners, asking the question: “How would you personally feel about being implanted for ease of identification with your own organization?” Perceived reasons for rejecting an employee ID implant vastly outnumbered the perceived reasons for accepting one. Interestingly, only a small number of respondents were undecided or provided conditional acceptance, indicating that the question is very divisive and that people hold absolute personal feelings toward the piercing of their skin with a foreign device (e.g., a microchip). We can deduce more broadly from these results that a new digital divide is potentially forming between the smaller group who are either indifferent to implants or feel that implanting people for ID is a smart idea and the far larger group who reject the concept outright. Given the study was conducted in April 2011, the quantitative and qualitative investigation needs to continue, and follow-up surveys have been conducted in 2012 and 2013. For now, much media hype persists around one-off experiments that show people adopting implants mostly for physical access to assets (e.g., cars) or buildings simply as a replacement for keys or cards. Yet, these experiments are beginning to demonstrate how humans can be human-plus through body modifications, and they may well be influencing citizens to experiment with new technologies in unprecedented ways.


In today's postmodern Western world, there is a greater propensity toward consumerism. Mass-market production coupled with international trade means that you can buy just about anything made anywhere with the simple click of a mouse. Not only are we seeing the commoditization of things (i.e., material objects), but also businesses and industries are capitalizing on this consumerist mentality, studying individuals' buying habits to demographically target their market. This data mining is done in a multiplicity of ways, such as through technological monitors called sensors. Sensors capture human-centric data at discrete intervals, generating big data that draws out patterns. Consumer behavior itself can thus be seen as a type of commoditization, not of the product or service but rather of the consumers themselves. And yet, despite these trends toward mass consumption of material goods and monitoring consumer behavior, sociologists are grappling with how Western civility is radically turning from the accumulation of external commodities, such as goods and services, to viewing one's own body as a form of human capital—to utilize as an outer expression of the self—whether in part or in whole.

In 2010, Michael et al. [1] wrote an article on the Web of Things and People in a special issue, “Radio-Frequency Identification (RFID) Innovation” in Proceedings of the IEEE. The article described a world in which every object could be connected to the Internet and how society was undergoing a paradigm shift in which “human connectivity” was paramount to the “connectivity of things” notion. Whether through the ability to surveil people with location-based services (e.g., using smartphones), wearables that are strapped to the wrist (e.g., quantified-self devices), unique forms of identification applied to the human body (e.g., microchip implants), or permanent prints on the body that are deep in symbolism (e.g., tattoos), the body is increasingly becoming a hub for outward expression through decorative art as well as gaining the potential for ambient intelligence through technology. Our bodies go everywhere we go; they can be directly seen by onlookers, they are permanent, and we cannot live without them, but they are also limited in span and size. Our personhood is encapsulated within our bodies (i.e., major organs like the brain that denote our personality), but we also have an outward appearance that is a type of visual biometric, given what we choose to do with our skin and bodies both on the surface and transdermal layers (i.e., beneath the skin).

The voice of the citizen that was once mainly exemplified in various public forums has now radically turned inward—placing the emphasis on one's self as a medium to convey the message of choice [2]. In addition, it can be argued that this message largely connotes a desire to be heard. Forms of self-expression, such as body piercing, tattooing, scarification, chipification, and the like, are exonerated through the mere fact that these acts are largely going unchecked, whether viewed as non-self-invasive, harmless, or radical [3]. Such control over one's own body through alterations and modifications can be grouped together and argued to convey a universal message, a message that heralds a collective statement that the human body is in fact a form of personal physical capital [4] and therefore fully within one's rights to self-legislate.

Societal Norms, Planned Obsolescence, and Technological Adoption

Some theorists argue that there are mechanisms of control, such as with those controlling the message in the media, that are softly coercing active citizenry into a state of docility—conforming to expected societal norms of the dominant class of influence, while uncritically accepting rapid change such as that which is found in a highly advanced society led by rapid technological growth [5]. We observe this claim as people continue to feel the need to purchase the latest high-tech gadgetry, whether it is the latest smartphone, tablet, smart TV, or even a drone. We are no longer satisfied with a functional device; it must be the device with all the latest bells and whistles. Others argue that conformance is much more social and is the result of the individual adopting culturally bound practices within one's defined subculture [6]. A third perspective is a top-down approach that views technology and the need for adherence as being the result of an organizational or governmental endeavor [7] to ensure civil order and eradicate social injustice, which one day will aid in bringing about worldwide emancipation [8]. We note this in the mandatory adoption of certain ID cards for transport and Social Security or even in the planned obsolescence of products developed by companies to ensure the consumer is locked in to an endless array of upgrades [9]. Commodities are not built to last because it means the individual will remain a lifelong consumer—ensuring continued business while fueling the consumerist mentality.

Yet, it can be argued, in this endeavor to maintain social order—which also enables civility to live in community, function in commerce, and progress toward self-maturation [10]—that technological change (which highly endorses systematic order) often becomes so restrictive that it turns out to be repressive to those who are subject to it. In such instances, governing technological mechanisms are often deemed as a form of top-down control. As a way for the individual to break free from the limitations of the systematic straitjacket (which can be argued to highly parallel Weber's “iron cage” of rationality [11]), the restricted self may react in a variety of ways. Such reactions may include demonstrating disapproval outside the jurisdiction of one's own “locus of control” [12]. Moving beyond such confinements is often deemed defiant in nature and subject to various penal measures that may even result in punishment not equal to the crime [13].

Empowerment through the Embodied Self

This fear of discipline can then be seen as encouraging society to turn its outward objective gaze, which once strove to understand society as an organic political whole, to the embodied “self” who looks to one's body as a safe medium to reflect opinion that is void of external punitive repercussions. Whether this tendency occurs instinctively or even intuitively, it is here that social science can look to the human body as a sign carrier [2] and ask whether subcultural groups, such as the body alteration and tattooing movement (thus far grouped as modern primitives [14]), are unconsciously working together to reaffirm the right of the autonomous self to govern changes to one's physical being. Whether this movement is growing as the result of a loss of the public forums once available to the general public as a means of free expression is yet to be determined.

Currently, there are no laws that protect people's right to commodify the human body as a means of exercising freedom of speech; rather, such a public display can be argued as legal as it remains largely uncensored by the state. There are state-based acts that stipulate that an individual should not be enforcedly microchipped in the United States, for instance, but this is legislation that guards against a top-down implementation and does not cover the individual's right to modify one's own body [15]. Additionally, CASPIAN Director Katherine Albrecht had proposed a Bodily Integrity Act in 2007 to prevent the forced or coerced chipping of individuals in America [16]. One thing is for certain, major historical change does not transpire without a radical shift in society's behavior, which is not only reflective in one's thinking and level of acceptance to change but also endorsed by society's collective act of adoption. It is this postmodern preoccupation with remaking the human body, combined with the uncritical acceptance of technological change, that makes the intermingling of human and machine an outward phenomenon well worth investigating. Herbert [17] argues that the intermingling of human and subdermal devices is “a social phenomena of technological branding.” The trans-humanist movement, full of high-profile techno-evangelists, typifies this all-you-can-eat technology paradigm to the point that they propose that soon we will all become something other than human, as if being purely human is not enough.

The Right to Govern One's Own Body

Figure 1. Ganesha Balunsat reflects on the need to remove her dermal anchors: “This is my last day with my two beautiful dermal anchors. I will miss them so much! They've lasted a year and a half now, but they have been slowly rejecting, so I will be taking them out tomorrow. As much as I don't want to, it's probably better I get them out now than waiting for them to fall out by themselves and leave a fat scar…” (Courtesy of Ganesha Balunsat, 2010.)

Body piercings (Figure 1), tattooing, and other forms of more radical alteration (Figure 2), such as skin laceration, fall within a field of study grounded in the sociology of the body [6] and yet extend more broadly to issues of universal human rights as well as international humanitarian law. It can be argued that the individual is both a social being and a political citizen with certain rights to self-legislate [18]. This juxtaposition places the emphasis on one's own human body as a vehicle to self-determinate, while inadvertently exercising political freedom collectively at an objective level. While social scientists strive to comprehend the signs of the times and endeavor to mark this era as being distinct from any other time in history, the remaking of human identity through technology as a means of the individual exercising political freedom is a clear indicator that we have entered a new cultural era. As well, the degree to which subdermal technologies are being considered for top-down implementation as a means to improve the human race while maintaining social order is another clear historical marker that society's ideological beliefs have radically shifted and modernity has come to a close.

Figure 2. Spaz Tacular from Germany takes a photo of a friend's implants. He notes that, some weeks later, the ball in the middle of the forearm started to wander downward and got much closer to the left one. He wrote: “Sucks ’cause you can't just push it back in place, you know?” (Courtesy of Spaz Tacular, 2004.)

Due to the lack of true public forums (e.g., public referendums), the self is becoming less engaged with the external political world [19]. This is resulting in the individual having an ever-growing fascination with the forming and remaking of one's own identity. Goffman interprets the use of the human body as a type of “sign carrier,” arguing that the way people adorn and present their bodies is how they impart knowledge about themselves to the outside world [2]. Frank argues that this message permeates the level of subjectivity and therefore is not silent. He argues that through the paradoxical interplay between modern society and the speaking body, the polarity between subjectivity and objectivity is resolved. Likewise, as argued by phenomenologists, the chasm between these two views can be resolved through an investigation of the manifested collective phenomena [20]. By the speaking body, Frank is referring to an understanding of communication as quite literally embodied: human cognition and communication grounded in corporeal (physiological) experience. Thus, Frank concludes that although our human experience and the way we interpret society is subjective, we all have bodily experiences that are common to one another and are therefore grounded in a type of objective tangible reality. Hence, such common experiences can be reflected upon corporeally, and in doing so they provide a mutual comprehension of our social world [20].

Although the study of one's individual phenomenon provides us with the subjective perspective, whereby we can still gain knowledge through the investigation of each independent case study, larger quantities of like phenomena can be grouped collectively to look at subcultures more holistically. This provides us with an objective view, which then presents a more macroscopic lens on the way in which the individual is remaking human identity as a whole, through the aid of technology [21]. Although such studies are highly qualitative, their heavy reliance on observation makes the findings highly empirical. Through the study of the manifestation of the physical body and the individual's actions, science can obtain an objective view of an individual's subjective experience on a collective level, which Frank deems as being corporeal.

Modern Primitives and the Rise of Body Modifications

Regardless of whether the self is acting consciously or unconsciously, the remaking of individuality through technology is a worthy subject of study and can provide twofold value—enlightening the social scientist while giving the individual a sense of worth in the embodied experience—placing emphasis on the purpose that spurs the individual toward certain ends [22]. In this sense, it is equally a study of the actions of the one, which can be contextualized within the many when a common denominator is found, where the objective and subjective dichotomy are at least partially harmonized [20].

Klesse also deems the marking of the human body one such phenomenon worthy of investigation on a macroscopic level. Presently, sociologists see societal groups that alter and mark their bodies as “a subcultural movement in the intersection of the tattoo, piercing, and [the] sado-masochism scenes” [14, p. 309]. According to Klesse, this modern movement originated in California in the 1970s and has grown significantly over the past decades. One body modifier states, “I am a part of this [modern] culture but I don't believe in it. My body modifications are my way to say that” [14]. Musafar, who was noted as the most prominent of all body modifiers within that scene, coined the term modern primitives as a way to identify himself along with others who alter their bodies as a response to primal urges [23]. Given the highly diverse reasons for engaging in such practices, it is understood that this subculture comprises multiple communities. Klesse writes [14]:

One of the most significant characteristics of the Modern Primitives movement is their appropriation of “primitive rituals.” In their search for radical corporal, psychic and spiritual experiences and their performance of sexual events and encounters, Modern Primitives seek inspiration by so-called primitive societies through the adoption of their communal rites and body modification techniques.

Such body modifications are viewed as an activity engaged in by consumers as a means to construct one's identity through the transformation of one's own physical capital—the human body [24]. It is here that the social theorist is making an indirect reference to the individual as being an autonomous agent possessing ownership and rights of governance over one's own physical body. This doctrine must be grounded in the understanding that self-determination is limited, in that it excludes the right to inflict bodily harm. This distinction needs detailed articulation, due to irrational behaviors (i.e., cutting, pleasure in pain) that can be argued as being a direct result of a psychological disorder [25] and subject to medical prevention. It can be argued that such irrational behavior puts human rights to self-govern in jeopardy. Whether consciously or not, the embodied selves are collectively growing in number, and in this sense, their actions are becoming unified—forming a collective voice of solidarity, crying out, “Enough is enough, this is my life; I have the right to alter my ‘own body’ as I please.” These very rights, combined with the way in which technology is changing the propensity to body alter, are central to this discussion. It addresses whether full governance should be placed with the individual as a type of universal right of self-legislation, ethically established through critical discourse, or whether rights of autonomy remain as they currently are—a matter of the law—determined on a case-by-case basis.

The Momentum of Implantable Devices that Pierce the Skin

Figure 3. Grindhouse Wetware's Northstar implanted in a left hand with its LEDs lit up. The Northstar is a magnet-activated and LED-equipped subdermal device developed by Grindhouse Wetware, an open-source biotechnology startup company based in Pittsburgh, Pennsylvania. Grindhouse applies the biohacker ethic to create technology that augments human capabilities. The company is best known for its Circadia device, a wireless biometric sensor that was implanted into cofounder Tim Cannon in October 2013. (Courtesy of Ryan O'Shea, 2015.)

Between 2014 and 2015, international media covered numerous Internet of Things stories that make this article timely. In April 2014, GroupM's Irwin Gotlieb said that the “Wearable is cool, but the next form of media will be implantable—devices which are implanted in the human body” [26]. Google Director of Engineering Ray Kurzweil concurred that we would have “millions of blood cell-sized computers in our blood stream” within 10–20 years [27]. In June 2014, IEEE Spectrum reported that Medtronic wanted to implant sensors in “everyone” [28]. In November 2014, Peter Diamandis, well-known chief executive officer of the XPRIZE Foundation and cofounder of the Singularity University, got a near-field communication (NFC) implant on the spur of the moment at the Singularity Summit in Amsterdam [27]. He said in his blog [29]:

Many big companies like Apple, Samsung, and Google are working on technology to measure your biology from outside of your body. Wearable devices ranging from watches to contact lenses will track everything… footsteps, heart rate, blood glucose, blood pressure and other critical vitals. The challenge is that they only work when you remember to wear them, and there are some things you can't measure from the outside. The question is: when would you be ready to start incorporating technology into your body?

Figure 4. (a) Justin Worst, Marlo Webber, and Jes Waldrip show off a Northstar implant in their hands. (Courtesy of Ryan O'Shea, 2015.) (b) Tim Cannon and Justin Worst greet one another with their Northstar devices, which light up when they come into contact. (Courtesy of Ryan O'Shea, 2015.)

To demonstrate that this thinking about next-generation IT was not isolated to the United States, in December 2014, eight Swedes held an implant party in Stockholm. BBC News reporter Jane Wakefield noted Hannes Sjoblad's hope that his implant party would spark a conversation about our possible cyborg future. He said, “The idea is to become a community; that is why they get implants done together…People bond over the experience and start asking questions about what it means to be a man and machine…Curiosity is one of the biggest drivers for us humans. I come from a maker hacker culture and I just want to see what I can do with this” [30]. In January 2015, it was reported by the BBC that a high-tech office block in Sweden known as Epicenter was granting employees the option to have a microchip implanted under the skin for physical access control to the building, among other functions [31]. In August 2015, Lloyds Bank announced that about 7% of U.K. consumers would adopt microchip implants in their body for making electronic payments [32]. In September of the same year, Kaspersky Labs became intrigued with the security issues related to microchip implants and invited Sjoblad, chief disruption officer and founder of BioNyfiken (of Epicenter), to its APAC Cyber Security Summit in Malaysia to demonstrate the implantation process [33]. In November 2015, Tim Cannon of Grindhouse Wetware in Pittsburgh, Pennsylvania, launched the Northstar device. While version 1 is limited in capabilities, the Bluetooth-enabled version 2 promises gesture recognition to control remote electronic devices as well as the addition of patterns or color variations to the existing light-emitting diode (LED) (Figures 3 and 4) [34].

Figure 5. Amal Graafstra, director of Dangerous Things, gets an RFID implant in the webbing of his right hand in early 2005. Graafstra is a Washington state native and business owner who launched www.myuki.com. The photos show (a) the point of implantation and (b) the bandage after the implantation procedure. (Courtesy of Amal Graafstra, 2005.)

This is all while DangerousThings.com has been creating a recognized brand with NFC/RFID implant solutions for biohackers since 2013 [Figure 5(a) and (b)]. Visiting the home page of Dangerous Things, one is greeted by the following messages: “We believe biohacking is the forefront of a new kind of evolution” and “RFID/NFC next level body augmentation.” But most pertinent of all to this article is a statement on the “About Us” page noting, “We believe our bodies are our own, to do with what we want. The ‘socially acceptable’ of tomorrow will be defined by boundaries pushed today, and we're excited to be a part of it” [35].

Drawing the Plumb Line

Klesse states that the signs of the time have been marked by “an unprecedented individualization of the body [where] technological developments, among others, allow for the alteration of the body” [14]. Yet, clearly, it is not just that new technological development is opening up alternatives for body alterations but that the mass acceptance of body-modifying practices is shifting the mind-set of the individual to more readily accept skin-embedded technologies. In this sense, there is a conformance transpiring that both is and is not completely led by one's own free volition.

In Taylor's studies in Hegel and Modern Society, he addresses the notion of being free from external influences. He discusses the question pertaining to freedom by asking if one is truly free when “being motivated by one's own desire, however caused?” [36, p. 3]. Taylor goes on to answer this question by stating, “moral freedom must mean being able to decide against all inclination for the sake of the morally right” [36]. In contrast to the moral relativistic perspective that views happiness as a by-product of fulfilling one's own desire, he writes, “Instead of being dispersed throughout his diverse desires and inclinations the morally free subject must be able to gather himself together, as it were, and make a decision about his total commitment” [36].

Taylor adopts a highly sociological approach and argues that “following the Heideggerian dictum of being-in-the-world,…human beings are already situated in a certain context of cultural meanings; they are embedded in a web of pre-existing and pre-interpreting cultural significance” [37]. Although Taylor argues the need for an objective stance, he in no way supports penal actions for those who have not reached a place of true moral freedom—a place where the self is free from inclinations of the culture in which one is imbued. While this reference helps to determine the distinction that needs to be made, and clearly supports the notion that a certain level of maturity must be in place before an individual can truly exercise proper moral freedom, it in no way supports the notion that freedom of choice should be taken from individuals who lack the capacity to clearly decipher whether their decisions are objectively made and free from external influences, once the legal age of consent has been met, at least insofar as the decision is in reference to one's own autonomous self. Likewise, Baron de Montesquieu advocated against a standardizing of society or leveling of tastes or ideologies through imposed indoctrinations.

Montesquieu wrote [38, p. 54]:

If there were in the world a nation which had a sociable humour, an openness of heart, a joy in life, a taste, an ease in communicating its thoughts; which was lively, pleasant, playful, sometimes imprudent, often [injudicious]; and which had with all that, courage, generosity, frankness, and a certain point of honour, one should avoid disturbing its manners by laws, in order not to disturb its [tranquility].

It is here that we argue that the right to exercise moral freedom must be extended both to those acting intuitively and to those acting instinctively, to the extent that one's intuition or instinct aligns with the rights of the individual and does not demonstrably work against the good of the collective. Human instinct is innate and does not parallel Taylor's notion of inclination; in disciplines such as business, it is instead referred to as a gut feeling [12]. This feeling is subjective, often going against all odds, which makes it distinctly separate from an inclination derived from calculative thought or social influence. Hence, moral freedom concerning the adoption of body-invasive practices or technologies, as well as the refusal of such practices, should not hinge on one's ability to articulate the rationale behind one's position, whether the individual believes adoption to be right or wrong. We argue that human choice, with respect to the right of moral freedom, is not reserved for the cognitively developed, as intelligence is not limited to academic achievement.

Who Owns My Body when Technologies Invade it?

Figure 6. An X-ray of @_BirdMachine showing her Grindhouse Wetware Northstar, two magnets, and an NFC tag. Her hand is multifunctional as a result of the devices, which can perform as standalone or integrated. (Courtesy of @_BirdMachine, 2015.)

In issues that involve moral freedom, there needs to be a clear distinction; let us call it the plumb line. The distinction concerns ownership of one's own body in the interchange with body-invasive technologies. Currently, there is a great divide. In our examples, we have largely focused on free adopters (the so-named modern primitives, also known as RFID implantees, biohackers, grinders, or do-it-yourselfers) who alter their appearance as a means of conveying their identity and freely exercising self-governance. The plumb line, however, is not drawn there; it is drawn at placing individuals under marks of servitude through top-down practices or organizational implementation, imposing an ideology of acceptance that is not one's own. While the first supports the freedom to self-determine, the second leaves no room for moral freedom to be exercised, such as the outright refusal to accept changes to one's physical capital.

Various theorists argue that body modification is a way of constructing one's identity. For example, inserting metal devices under the skin can be seen as a form of resistance to traditional pressures to normalize, challenging the expected norms of society (Figure 6) [24]. However, while modern primitives engage in consumption, using their own bodies as a form of physical capital, the practice often parallels Western civilization's extreme commoditization of external goods [6, p. 305].

It can be argued that, rather than taking a stand against repressive systems or resisting expected societal norms by using the body beyond its natural intent, the modern primitive is instead aligning with the linear historical direction of ever-increasing rationality that seeks to merge man and technology as a means to eradicate social injustice, whether as a mechanism of control or as a means to maintain social order (Figure 7). This was particularly exemplified when implant proponent Sjoblad told the BBC that his Swedish biohacking group had another objective for the Epicenter trial: preparing us all for the day when others want to chip us. Sjoblad was quoted as saying, “We want to be able to understand this technology before big corporates and big government come to us and say everyone should get chipped—the tax authority chip, the Google or Facebook chip” [31]. Similarly, when Amal Graafstra was asked in 2007 whether he would accept a national ID implant, he replied, “a lot of people ask me…if I am ever going to get my tags removed and I do not really see a reason to do that—unless of course they become oppressive in some way and my particular brand of tags can be used in that [oppressive] system, then I would remove them” [39, p. 448].

Conclusion

Figure 7. Tim Cannon placed the biometric sensor called Circadia 1.0, which he created, under the skin on his forearm. Circadia 1.0 connects by Bluetooth to an Android tablet. The sensor tracks changes in his body's temperature and sends him alarms via SMS if it goes above the normal range. Cannon created the sensor by integrating a Bluetooth connector, microchip, and LED lights. The LED lights up a tattoo on Cannon's arm, under which the sensor is fitted. To insert the device, an incision was made on Cannon's forearm. His skin was lifted and separated away from his tissue, which was not a painless procedure. The device was inserted into the pocket that was created in the subdermal layer, before his skin was sewed up. (Courtesy of Tim Cannon, 2013.)

Through the participation of modern primitivism, it can be argued that the cyborg becomes less alien or sci-fi and more culturally acceptable through a preconditioning of society. The ancient metaphor thus becomes present-day reality, while through the very act of adopting body-modification practices, embodied selves are collectively sewing pillows to the armholes of the people, stripping humanity of the power to evoke change. In this sense (which differs radically from the collective voice of solidarity described above), it can be argued that modern primitive acceptance of body-modifying devices has the potential to inadvertently promote a form of cybernetics designed to place humanity at ease, where the individual can easily enter a state of docility while the governing system acts as big brother, maintaining social order in exchange for a standardization of goods and services. Theorists are already predicting that embedded technologies will be viewed as a user-friendly mechanism to ensure social order, while making unprecedented promises to the general public [40].

If the individual is lacerating the skin by one's own volition, or inserting metal devices to add texture and contour, whether for aesthetic value, sexual appeal, on-body computing, group affiliation, or mere shock value, the motivating factor driving the cultural movement becomes less relevant than understanding the direction in which it could be argued to be leading the masses. It is here that the movement can be sociologically grouped and viewed as an important signifier, one that draws our attention to other movements, such as cybernetics, and to how the acceptance of body alterations (e.g., lacerations, embedded metals, and sadomasochism) is paving the way. In this sense, body alterations of this nature differ very little in appearance from a top-down cyborg in the form of state paternalism; the necessary distinction is that the right of adoption or refusal remains within the individual's jurisdiction, in conjunction with its utility or purpose.

To ensure clarity, it is imperative that modern primitive acts not be grouped as a whole. For example, although sadomasochism and skin laceration are extreme, they concern not only the rights of self-governance but also, by their very nature, can be seen as unlawfully harming oneself. Causing bodily harm to oneself is therefore distinctly different from, say, nose piercing, which carries little to no residual physical harm and little potential for use as a form of social control, other than being seen in ancient practices as a mark of enslavement, an ancient landmark. Regardless of the stance we take on body modification, the human body is sacred, and trespassing on it without consent carries serious repercussions, whether known in advance or the result of unintended consequences; it is here that the line is drawn. In conclusion, we are not advocating that the right to modify one's own physical body be taken away but rather that lines of distinction be drawn so that moral autonomy remains intact. To address theorists' concerns, it is imperative that this movement receive ever greater levels of articulation.

Are you addicted?

What is it with us today? We are giving over control to the machine and losing touch with the physical world around us [1]. We are witnessing the decay of our meaningful relationships, sucked into electronic vectors of nothingness, right before our very eyes [2]. Sometimes we are at a loss to describe this phenomenon, reflecting on how members of our own family have been duped by the promise of a Second Life.

It is true that some people are predisposed to different types of addiction, such as drugs, alcohol, and gambling, all of which act to mask an underlying condition, usually obsessive-compulsive disorder, depression, and/or anxiety. However, we are now confident that a compulsion toward excessive video game playing will be added to that suite of newly defined behavioral addictions requiring our urgent attention.

This article is dedicated to video game addiction, given its widespread reach, but we would be the first to admit that this is simply one of a dozen types of computer application that can trigger deep-seated dependencies [3]. Although video game addiction was not included as a formal diagnosis in the 2012 draft of the U.S. Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the manual did flag Internet use disorder as a condition warranting further research [4]. In contrast, the Chinese have already defined the disorder, and some studies have claimed that as many as one-third of mentally ill patients who stay at home are addicted to the Internet [5], [52].

We can point to the increasing number of video game and online Internet detoxification clinics around the world, in existence since at least 2005, especially in China [6], [54], South Korea, and Taiwan [7]. From PlayStations and DSs to Wiis and Zappers, from iPods and iPads to Xboxes, our high-tech gaming toys are enslaving children, grandchildren, nephews, nieces, partners, friends, parents, and teachers [8].

Are you addicted to video games?

If you do not wish to admit to the possibility that there is such a thing as video game addiction, then you can just log into one of your avatars in your favorite massively multiplayer online game (MMOG), and, while your terminal is booting, ask yourself a few of the following questions.

▼ How long do you spend on your favorite MMOG each day? [9]

▼ Are you preoccupied by your favorite video game when you are not playing? Is it all you can think about, even at the expense of your closest relationships?

▼ Over the last 12 months, have you put on weight as a result of your gaming habits?

▼ Do you have any friends outside those connected to your online avatar(s)?

▼ Are your grades at school slipping, or is your employment suffering as a result of playing games day and night? Are you suffering from sleep deprivation as a result?

▼ When you are on the computer engrossed in a session of play, do you lose track of time and forget about basic needs like eating, sleeping, or going to the restroom?

Moths are positively phototactic; cockroaches are negatively phototactic, meaning they seek out dark spots and crevices. Humans are like moths: they are drawn to the light. But video games can change that. Many addicted gamers no longer know the difference between light and dark, save for the light emitted from their screens. (Photo courtesy of Wikimedia Commons/Accassidy.)

We visited the Xbox homepage and were confronted with the following message: “A new generation has begun.” Yes, indeed it has. It is the generation drawn to the screen culture, like the moth is drawn to the light. But as soon as the moth touches the artificial light, it is no more. We can say that humans are also prone to the “moth effect.” Like moths, humans are naturally drawn to the light during the day, as opposed to cockroaches, which scurry into dark corners and crevices to avoid detection. We investigate what transforms the gamer—analogously from light to dark phototaxis—and what the ensuing social implications are for him or her and his or her close relationships [10]. How is it, you ask, that the screen emits bright light but the gamer is enveloped in visual and persistent perceptive darkness? Curtains drawn and lights out, the gamer retreats to his or her bedroom to play, and, if engaged in a first-person shooter MMOG, he or she continues to hide in the crevices to avoid being shot. There is a significant body of literature that needs to be studied in relation to target fixation [11] and video gaming. What is it that draws gamers to the console when they know that what they are attracted to has no real tangible benefit?

Just to set the record straight, this generation is not really “new,” as Xbox would have us believe, but about 40 years in the making. It is little wonder that the average gamer is a 35-year-old male [12].

In the beginning was Pong, then came MMOG

Pong interface—the instructions were simple: "Avoid missing the ball for a high score." Pong was akin to a game of ping-pong, but play was electronic in style and versus a computer.

We started off with Atari’s launch of Pong in 1972. The graphics could not have been simpler, and, by today’s standards, Pong seems an innocuous game. The instructions were simple: “Avoid missing the ball for a high score” [13]. But what was it about Pong that brought millions of players to the TV screen? Psychologists point to the feedback loop: the anticipation of the response from the terminal, a sense of achievement at gaining high scores, a mastery of sorts over the game, the chance to fill a void with some fun, and a momentary escape from the realities and responsibilities of life.

Then arcade games, at just a nickel a play, came into prominence at the same time that video-based poker machines surfaced to draw gamblers [14]. It used to be that Space Invaders (1978), Pac-Man (1980), Donkey Kong (1981), and Mario Bros. (1983) ruled: you used a laser cannon to ward off the aliens, you got to eat the pellets and fight off the ghosts and monsters, you gathered ammunition to defend yourself against anthropomorphic enemies, and, as Mario, you got to exterminate the pests threatening to rise from the sewers below New York. But something happened to the nature of the gaming industry after personal computers entered homes. Space Invaders gave way to DOOM (1993), Pac-Man to Grand Theft Auto (1997), and Mario Bros. to Manhunt (2003). Unsurprisingly, the promise of flight- and car-simulator games gave way to war and debauchery. The impact of the rise of the Internet was no different from that of the printing press, which was soon put to the production of propaganda and pornography [15].

Sergeant Shane Perry of the 401st Military Police Company displays his new Call of Duty: Ghosts game during the midnight release at the Clear Creek Post Exchange GameStop. (Photo courtesy of Sergeant Cody Barber, 11th Public Affairs Detachment.)

The defining point in the history of video gaming, however, came in 2003 with the introduction of Call of Duty, which offered a cinematic experience and moved away from the traditionally robotic behavior of in-game personas. Call of Duty also provided the illusion of a more organic and dynamic game, made possible by some clever programming, even though it still relied on linear scripting; it was much less formulaic than what gamers had experienced with previous first-person shooters. In the same year, massively multiplayer online gaming was touted as having well and truly arrived, as the Financial Times measured the per-capita gross domestic product of the EverQuest game world to be comparable to that of the 77th-wealthiest nation in the world. Player numbers then climbed into the millions and hundreds of millions across titles such as Happy Farm, World of Warcraft, and, more recently, Minecraft.

On the dark side of the…

The video gaming scene really came of age when the social networking elements of instant messaging, chat, videoconferencing, and presence information were added to the real-world-like online environment. All of a sudden, gaming became a fusion of unified communications that, if misused, could easily appeal to the darker side of human instinct. Again, we contend that this medium is no different from other entertainment, such as movies with dark themes, music with dark lyrics, or even books with dark messages [16]. But there is something about MMOGs that differs from books, music, and movies [17]. The latter have a beginning and an end, whereas MMOGs seemingly go on forever, a little like the continuous pieces of music on each side of Pink Floyd’s famous album The Dark Side of the Moon (1973). Is there something in this distinction? The flesh is mortal, yet MMOGs carry with them a seeming infinity. You can die a million times over and spawn back to life in a game like UberStrike (2010), but on Earth you only have one life.

We contend that killing people in a game, even when the characters are just animations, cannot be good for the human spirit, that is, the spiritual and mental part of our humanity. Spending many hours per day transfixed by high-impact violence (gross and unrelenting), high levels of gore (decapitations, dismemberments, and excessive bloodletting), offensive depictions of cruelty, and prostitution and heavy sexual themes traps us in an endless loop from which we cannot break free. The same argument can be made for any video game that begins to impede our productivity or our ability to take care of fundamental personal hygiene needs [18]. We should be directing our time toward positive and constructive play with long-term benefits, as opposed to spending copious time building a world that ceases to exist when the power goes off.

We know what some of you are thinking: Not all video games are bad for you. Support your positions with real evidence and scientific studies [53]. Just because I maim and kill online, virtually, it doesn’t mean I’ll do it in the real world, and, if I have virtual sex every so often or even rape a prostitute in a game, it’s not like committing a real physical act. It’s all just make-believe… Who are you fooling? We are building games today in our society that are not only distasteful but are extolling what are generally labeled in the legal domain as cyber crimes against the person [19]. The worrying part is that these violent depictions of everyday life are becoming more callous and entering the mainstream. Are you going to tell us that when you perform these vile acts online that you are actually feeling true love, peace, and joy? Whatever happened to extolling morals and values in our society and to ethical codes of conduct in the game-development industry? It is not even a question of traditional ethics anymore but of plain old common sense.

Yes, yes, having an online affair has naught to do with the real world and has no real-world repercussions. Poof! Smokescreen! What is your heart telling you? What is your body saying to you? Are these acts just our imagination, or are they real, with real-world repercussions? When I spend more time online than with the person with whom I share a bed in the physical world, isn’t there something wrong? [20] When your first thought when you awake is to make contact with your favorite gaming community so that you can go out on another mission and pick off a few more fraggers, you need to reassess your behavior [21]. Things and people around you will start to suffer—how can they not if you are spending 10–15 h glued behind the screen playing? [22] Something must give [23].

Consider the Korean couple who, in 2010, let their three-month-old baby girl die from starvation while they spent hours raising a virtual character, a young girl named Anima, in the game Prius Online [24]. Think about the case of the young Korean man who collapsed at an Internet café in 2005 and went into cardiac arrest after playing StarCraft (1998) for 50 h straight; of the young Chinese man who, in 2007, suffered a heart attack after spending almost seven straight days behind a computer screen (save for restroom breaks) [25]; and of the two Taiwanese men who, in separate incidents in 2012, collapsed in Internet cafés playing Diablo (1996) and League of Legends (2008) without a break for 40 and 23 h, respectively [26]. Consider the number of wives and husbands who have divorced their spouses over their gaming behaviors [27], especially for infidelity in Second Life and World of Warcraft [28]. And ponder the number of people who have lost their jobs because they cannot work and play video games simultaneously, later moving in with friends and relatives as a result of losing their income [29], [30].

Yes, we know what you’re thinking yet again: these are just one-off tragic stories, and they’ll never happen to you or your kids [31]. To which we respond, really? It is not difficult to come to the same conclusions we hold. Do your own field observations on your way to work, or the next time you are at a coffee shop or on a school campus. How many people, young and old, are absorbed by their mobile phones, immersed in the screen [32]? Parents, it’s time to admit that there’s a problem with how technology has taken over your family, your workplace, and your headspace. What are you going to do about it? Will you keep believing that resistance is futile? Do you think that you cannot change because you fear that little Johnny will make life hell for you if he doesn’t get his 8 or 9 h online?

Don’t we realize as a technology-reliant community that we are keeping these gaming companies alive by logging in for our kids on 17+ games when they are barely ten years old? What are we willingly exposing them to? We shudder in horror when African dictators enlist ten-year-olds to fight in wars, but we turn around and buy our ten-year-olds the experience of killing far more virtual people than any real war would ever make possible.

When we give in to the demands of our children for yet another video game, we are feeding the darkness in their imaginations and sullying their spirits [33]. What happens when we get so entangled and lost in this virtual world that we do not even see what is happening to our household?

Ask the difficult questions

Stop and ask yourself: Where are your kids? What activity are they engaged in? Are they outside or inside, sleeping or awake? Chances are that every single one of them is behind a console of one form or another, for one reason or another. Now go and do a physical reconnaissance—how many of them are playing games? That has to say something about what we’ve become, and what we hope to become is a question for an entirely different article.

Games like Grand Theft Auto: Vice City promote a host of vices that are contrary to positive societal values.

There’s an epidemic of parents failing to care for their kids, to feed them when they are hungry, to change their diapers, make sure they’ve brushed their teeth and gotten enough sleep for the day ahead. There’s also an epidemic of people failing to take care of themselves because they are addicted to electronic gaming or, more precisely, addicted to Wi-Fi. Why is everyone so bent on walking around and deceiving themselves that technology has not pervaded their life with a whole lot of ugly negatives? [34] Why are we all so scared to admit that what we are potentially creating is a road to nowhere? [35] For some, gaming has become a pathological addiction, and they cannot break free from the screen. Is the problem that we, too, are so engrossed by the screen that we cannot lend a hand?

We each know people who treat Facebook (or Instagram, Twitter, or Tumblr) as if it were an online game. What’s the difference? Instead of shooting a character in a video game, we can just like or tweet some news. It is all “button pushing” and “screen scrolling,” reactionary, and stimulating to the prefrontal cortex, is it not [51]?

Enter locked-in syndrome, a medical term used to describe “a condition in which a patient is aware and awake but cannot move or communicate verbally due to complete paralysis of nearly all voluntary muscles in the body except for the eyes” [36]. Similarly, a persistent vegetative state is defined as “a wakeful unconscious state that lasts longer than a few weeks” [37]. We borrow these terms to ask whether society at large is presently undergoing some kind of locked-in technology scenario. In our analysis, addicted gamers enter a comatose-like state: the heart is pumping blood, they are clinically alive, but they are barely conscious. Only the fingers around the console move spasmodically, showing us, the bystanders, signs of life in the form of a reflex action. Some even ceremoniously set food and drink beside them before settling in for a long session with the social media elements of gaming.

Have you noticed that when you try to talk to addicted gamers, their gaze does not leave the screen, remaining transfixed? If they happen to make a mistake while you are talking to them, you get blamed for the error in the most extreme way [38]. In the end, that is what the creators of these games want from us: a mind-numbing sense that what they are feeding us is good for our spirit [39]. Many of us, however, would not wish to believe that the primary driver of the game-development companies is to get us addicted from the start, because addiction means more revenue for them [40]. These games have a spirit behind them, but it is not one that lifts our souls or causes us to reach for greater things. It is the spirit of the times, that pervasive, uneasy feeling that things are not going so well toward the goodness and natural inclination of life.

This spirit of anarchy or nihilism leads gamers of violent virtual realities to see and dream of the hells in the games they play rather than of goodness, to be influenced by the images they see in strange visitations called nightmares, and to ponder demonic thoughts. We do not need to provide you with our evidence, which would only act to pollute your minds. Most players of these abhorrent games will not go out and conduct a massacre in physical space [41], but surely there are better ways to spend one’s time: playing outside with the children, admiring natural beauty with a sense of awe, going for a surf or a hike. Why do we choose a psychological prison, trapped not only inside but within ourselves? Those versed in the writings of Carl Jung will find much here that speaks to the “shadow aspect” of our personalities.

The next time you walk past your child’s door (whether he or she is a teenager or an adult living in your home), why don’t you spend ten full minutes together looking at how he or she is interacting with the virtual world through the computer device. In addition, ask about the music your child listens to and the movies he or she gets a buzz from. It is all one and the same: lyrics (auditory), multimedia (visual), consoles (touch and feel) enveloping the faculties of the human body. Immersing oneself in a whole lot of bad stuff is like immersing oneself in a cesspool. The problem with our life today is that we are swimming in the cesspool, surrounded by soft e-waste, and cannot see it for what it is. It is enveloping us on all sides and suffocating our freedom. We can only see things more clearly if we decide to get out of it, wash afresh, and then look with open eyes at what is before us. Yes, this does mean limiting our screen time.

Enacting change

When was the last time you embraced your children or told them you love them in the real world, not just over SMS or e-mail? There’s your challenge—get up off your chair right now, let go of that iDevice, and go searching in the physical space to reach out to that family member right now who is absorbed by their favorite high-tech gadget. It won’t be easy to get him or her to stop and to make eye contact with you, but that’s just the first step [41]. Be patient. It may take several weeks, or even months, but try detoxing the whole family from the dreaded technology that has bound them hand, foot, and mouth [42]. At first, try taking the family away to a location that is a complete dead zone—without even mobile connectivity [43]. Go away for at least a week. When you return, tell yourself you will not go back to your old ways and will hold your ground [44].

The screen is coming closer and closer, and it now seeks entry into the subdermal. Consider it this way: in the last 60 years, we have seen the advent of television, the computer, the Internet, the mobile phone, and now the wearable device that can distort and augment reality itself. What will come next? Will it be a translucent contact lens that completely replaces our actual field of view with another in a pervasive gaming environment? When we cannot make the distinction between fantasy and reality, are we really living? [45]

We must do better [56]. There is still a chance to resist. We can recapture our human rights and our dignity, reaffirm the rights of our children to have undistracted parents, and get back to a time when our children looked us in the eye clearly and brightly when we spoke to them [47]. We still have time to reaffirm the value of reality. We can change. We can do better. We have to remain in charge of the “screen” so that we can not only enjoy the great innovations of our times but also put them to good use.

Conclusions

There are people out there who are not slaves to technology and who are able to play games casually without any ill effect on their health. We are not asking people to live like an Amish community [48] but to consider how technology is impacting their home life and to start drawing some lines. Children are especially vulnerable [49], but parents are struggling with the same addiction and the same detachment from the real world, staring into a screen instead of sharing real hugs, real smiles, real conversations, real activities, and reality itself with their kids. And it’s not just video games: answering e-mails, talking on the mobile, texting, Facebook, web surfing, and YouTube can all be equally draining if misused [50].

We would be remiss not to point out the downsides that everyone around us is experiencing. It’s the elephant in the room, the emperor parading naked down the street, the skeleton in nearly every family’s closet. We are calling for people to wake up and admit that, collectively, we have a very big problem on our hands and to begin a thoughtful discussion of how we want to handle it. And lest you think we speak from some lofty, technology-free form of purity, we assure you that, one way or another, we have been down the ugly road we are describing. If we were not challenged by these matters ourselves, we would not be able to speak of them with such passion.

[4] D. Kupfer. (2012). Dig deeper into the new Diagnostic and Statistical Manual, the DSM-5, and you’ll find an entry for “gaming disorder” in Section III, meaning the American Psychiatric Association believes the condition warrants further research before it can be formally classified as a mental disorder. [Online]. Available: http://www.dsm5.org/Pages/Default.aspx

[13] The Great Idea Finder. (2007, Jan. 16). Fascinating facts about the invention of the Pong Video® Game by Nolan Bushnell in 1972. [Online]. Available: http://www.ideafinder.com/history/inventions/pong.htm

[28] W. Adams. (2008). UK couple to divorce over affair on Second Life. Time World. [Online]. London. Available: http://content.time.com/time/world/article/0,8599,1859231,00.html

[29] T. Lush. (2011). At war with World of Warcraft: an addict tells his story. The Guardian. [Online]. FL. Available: http://www.theguardian.com/technology/2011/aug/29/world-of-warcraft-video-game-addict

Acknowledgments

This article was adapted from “The Dark Side of Online Gaming,” written by Katherine Albrecht, Katina Michael, and M.G. Michael for the International Conference on Cyber Behavior, 18–20 June 2014, Taipei, Taiwan; it was awarded best paper at the conference.

"The Watch is Here" toutes Apple's wearable computer online marketing slogan, implying that the one and only timepiece that really matters has arrived on the scene [1]. So much for the Rolex Cosmograph and Seiko World Timer when you can buy a stylish digital Apple Watch Sport or even an Apple Watch Edition crafted with 18-karat gold (Figure 1).

Of its features and functions, we are told that the Apple Watch is a music player, fitness tracker, communications device, payment token, digital key, and last but not least, a watch that tells the time [2]! We are surprised that no one has claimed that it will also help look after our kids—well, actually they have, just visit the App Store. It seems that this device can do anything.

Figure 2. The first large-scale computer, the ENIAC. An original newspaper headline in 1946 read, “It Won't Mind the Baby—Yet; But Little Else Stops ‘ENIAC.’” Who would ever have believed that this wall-to-ceiling computer could be wrapped around a person's wrist in 2015? (Image courtesy of the U.S. Army.)

Who would have thought that the power of an Internet-enabled laptop computer, mobile phone, iPod, Fitbit, bank card, and set of keys could be neatly packaged and strapped around your wrist (Figure 2)? Images of the 1960s wrist-worn communicator prevail [3]. But unlike the original series of Star Trek, where these communicators could strand characters in challenging situations when they malfunctioned, were lost or stolen, or went out of range, the Apple Watch is being sold as the “all-in-one” solution that you'll never lose because it is always on you.

This raises the vital question: How will we change our behaviors now that we are walking around with a full-fledged computer that sits in contact with our bodies and communicates wirelessly with the machines around us without our knowledge? Apparently, we're all going to look more athletic and stylish, be smarter and more accessible, and have a lot more convenience at our fingertips. But in actuality, we'll be reaching for the mute button, longing to be disconnected, and fed up with all the notifications interrupting us. That's when the novelty effect wears off [4].

We have all witnessed people who cannot resist the urge to pull out their mobile phone and interact with it at the most inopportune times, or people who pass their idle time simply looking down at a screen [5]. Most do not realize they are interacting with their personal computing devices for hours each day; the repetitive behavior has almost become a type of neurobehavioral tic disorder. We get a message, it makes us feel important, we reply, and we get a buzz the very next time it happens again. It's kind of like digital ping-pong, and the game can escalate fast. The primary reason this repetitive behavior remains hidden is that the majority of mobile adopters suffer from it, and so it looks normal [6].

People in public spaces can be observed immersed in virtual places [7]. These Wi-Fi-enabled mobile contraptions can trigger a host of Internet-related addictions [8], whether used for gaming, answering e-mail, Web surfing, online transactions, social media, video chatting, or taking photographs. According to experts, Internet addiction disorder ruins lives by causing neurological complications, psychological disturbances, and social problems, not to mention the potential for accidents when people are not looking where they are going or not paying attention to what they should be doing. In short, our need to always be on and connected has become a kind of cyber narcotic drug.

Very few are immune to this yearning for “feedback loops,” so telecommunications operators and service providers pounce on this response. Information is money, and while we are busy interacting with our device, the companies are busy pocketing big money using our big data.

We are fast becoming a piece of digital information ourselves, sold to the highest bidder. And while we are busy rating ourselves and one another, the technology companies are not only using our ratings to learn more about our preferences and sentiments, but rating us as humans [9]. In sociological terms, it is called social sorting, and in policing terms, it is called proactive profiling.

In days gone by, mobile communications could already tell data collectors about our identity, location, and even condition. That is not new; what is new is real-time access to data of this precision and granularity as technology becomes more and more invasive, and we should be aware of it because it can impinge on our fundamental human rights.

Because watches interface with the human body, they have the capacity to tell a third party much more about you than just where you've been and where you are likely to be going. They can

▼ detect physiological characteristics like your pulse rate, heart rate, and temperature, providing insight into your home/work/life habits [10]

▼ determine time, distance, speed, and altitude information derived from onboard sensors

▼ identify which apps you are using and how and why you are using them, minute by minute

▼ oversee the kinds of questions you are asking via search engines and the text-based messages you are sending via social media.
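To make concrete just how much can be inferred from a wrist-worn sensor stream, consider a minimal sketch of the kind of analysis described above. The data format and thresholds here are hypothetical, not any vendor's actual API: it derives average speed from timestamped location samples and makes a crude activity inference from heart-rate readings.

```python
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Sample:
    t: float        # seconds since start of recording
    lat: float      # latitude, degrees
    lon: float      # longitude, degrees
    heart_bpm: int  # heart rate, beats per minute

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two coordinates."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * R * asin(sqrt(a))

def average_speed_kmh(samples):
    """Total distance over elapsed time, from raw location samples."""
    dist = sum(
        haversine_m(a.lat, a.lon, b.lat, b.lon)
        for a, b in zip(samples, samples[1:])
    )
    elapsed = samples[-1].t - samples[0].t
    return (dist / elapsed) * 3.6 if elapsed > 0 else 0.0

def looks_sedentary(samples, bpm_threshold=75):
    """Crude inference: a low average heart rate suggests inactivity.

    The threshold is an illustrative assumption, not a clinical value.
    """
    avg = sum(s.heart_bpm for s in samples) / len(samples)
    return avg < bpm_threshold
```

Even this toy pipeline turns two raw sensor channels into lifestyle claims, which is precisely the kind of inference a third party with continuous access could make at scale.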

These watches will become integral to the fulfillment of the Internet of Things phenomenon—aiding the ability to be connected to everyone and everything.

All in all, corporations can know what you are thinking, the problems you are facing, and your personal context [11]. What is disturbing is that these data can divulge our innermost personal thoughts, intentions, and actions and provide evidence from which to infer our reasons for doing things.

Most people who are immersed in the virtual world through high-tech gadgetry are too busy to think about the act of inputting information onto the Internet. People tout a life of convenience over privacy and are therefore not concerned about what information is being logged by a company and shared with hundreds of other potential partners and affiliates [12]. Generally, consumers are oblivious to the fact that, even if they are doing nothing at all, the smart device they are carrying or wearing is creating a type of digital DNA about their uniqueness.

Today we are asking to be surveilled and are partying in the panopticon. We have fallen in love with the idea of being told about ourselves and don't discern that we have become prison inmates who are being tracked with electronic bracelets.

By the time we wake up to this technological trajectory, it will be too late. Our health insurance provider will be Samsung [13], our telecoms provider will be Google [14], and our unique lifetime identifier will come from Apple [15]. Of course, these are merely today's archetypal tech providers; tomorrow, who knows?

And by that time, we will likely be ushering in the age of uberveillance, where we posit that cellphones and wristwatches are not enough, that the human-computer interface should go deeper, penetrating the skin and into the body. The new slogan might read "The Mark Is Here," heralding the iPlant, which gives birth to life, the one and only passport to access your forever services.

"You can't live without it" won't just be figurative; it will be a reality.

Acknowledgment

This article has been adapted from “The Apple Watch Heralds a Brave New World of Digital Living” published in The Conversation on 14 May 2015 and available at https://theconversation.com/the-apple-watch-heralds-a-brave-new-world-of-digital-living-41171.

High-tech robots called PackBots [1] will be unleashed during the 2014 FIFA World Cup in Brazil to help boost security and examine suspicious objects [2]. The Brazilian government purportedly spent US$7.2 million to buy 30 military-grade robots from the designer iRobot that will police the stadiums throughout Brazil's 12 host cities during soccer matches [3].

A PackBot is a hunk of metal with an extendable arm, tactile claw, jam-packed onboard sensors, a computer with overheat protection, nine high-resolution cameras and lasers, and two-way audio [4]. But is it overkill to implement wartime robots at a sporting event?

Sports' History of Violence

On 30 April 1993, then-world number one tennis sensation Monica Seles was stabbed in the back while playing a quarter final match at Hamburg’s Rothenbaum [5]. She was only 19 years old. That incident changed not only the course of women’s tennis but also the face of security in sports [6].

Of course, we can also point to the Munich massacre of 11 members of the Israeli Olympic team during the 1972 Summer Olympics in West Germany in rethinking approaches to the safety of high-profile athletes.

It was Seles' plight, however, that brought attention to the ever-increasing problem of public-figure security. Her stabbing in Hamburg had nothing to do with terrorism but rather was due to her attacker's fixation on Seles' archrival Steffi Graf. Player safety was going to become an even bigger business.

It was floated that the Rothenbaum tournament organizers had spent A$650,000 on security, and that Seles herself had employed security guards to protect her at all of her tournament appearances. So what went wrong?

The Human Factor

Not only are people unpredictable but intervention is almost impossible if one cannot anticipate the actions of another. On 13 November 1982, one of Australia’s great wicket takers, Terry Alderman, made a costly mistake when he took security matters into his own hands [7]. The West Australian was disabled for more than a year with a shoulder injury he sustained after attempting to tackle an English-supporting ground invader at the WACA Ground in Perth. Such has become the concern over security that spectators can no longer spill onto the grounds after the final siren to get close to their heroes.

Pitch invasions had long been a tradition of the Australian Football League: at the end of matches, supporters could run onto the field to celebrate the game and play kick-to-kick with their family and friends. But in recent years, stricter controls have been introduced, and rushing the field was finally banned, to the great disappointment of fans.

The Nonhuman Factor

What makes PackBots attractive for civilian security situations such as large-scale sporting tournaments? PackBots made their debut in Afghanistan as far back as 2002. During the war on terror, these uninhabited systems had several tasks:

▼ clear bunkers

▼ search caves

▼ enter collapsed buildings in search of life

▼ cross minefields

▼ conduct surveillance [8].

This began a trend of development that continued in Iraq and other U.S. conflicts, and, recently, these robots went where no human would want to go: the Fukushima nuclear facility in March 2011, after the devastation of the Japanese tsunami. There are certainly positive uses for these uninhabited systems, and few would argue against them.

PackBots can move faster than 14 km/h, rotate 360°, traverse rugged terrain, climb up 60% grades, and even swim in water, as they are able to cope with being submerged up to 2 m. They can even be remotely operated with hardly any lag using a joystick [9].

iRobot’s bots are not recent entries into the commercial market. Many of us were introduced to the domestication of the robot by the company’s Roomba household cleaning machine [10].

The use of electronics in sports is also not new. Hawk-Eye determines whether the ball landed in or out of bounds [11], the FoxCopter hovers above spectators at cricket matches to give us up-close personal shots of players, and the third umpire adjudicates challenges [12].

But now the PackBots are coming: ostensibly precise, they are not supposed to malfunction or act against the controller's wishes (or the instructions they have been programmed with), and they cannot be easily destroyed. In the not-so-distant future, they could use their cameras to observe you, their chemical sensors to breathalyze you, their extended arm to trap you, and their claw to handcuff you.

Have we seriously considered some of the more obvious implications? First, there is the robot warrior at war, then the PackBot at the soccer game, and now the homebot, which we invite into our very households. We are reminded of a famous headline in The New York Post in 1946 in reference to the world's first automated computer, "It Won't Mind the Baby—Yet; But Little Else Stops 'ENIAC'" [13]. The trouble is that, while we are resting our feet in front of our screens, trying to perfect those remote gaming-like controls, or designing killing machines with crossover household capabilities, we might fall asleep at the wheel of the real task at hand: creating a more equitable world while still remaining the primary stakeholders in the technological society [14]. The danger is being caught out, like Seles, by the knife-wielding attacker who rushes up from behind. How long will it be before we give up absolute control to these autonomous objects?

How long before we give them additional powers? Will we look forward to societal control on the streets as they lurk, waiting for someone to break the law, to sustain peace and justice? Will these fearless warriors be deployed on the street to clean up alcoholism, drug abuse, and even homelessness? How long will it be before these PackBots make a run for their own pitch invasion, replacing our mortal athletes altogether because they can perform feats and withstand extreme physical conditions beyond those of humans? And what consequences will these anthropomorphic yet mechanized abilities have for being human?

Will being human lack its luster? Is this the beginning of the normalization process that says that man and machine must coexist side by side? Let us in our own time reflect on the impenetrability of Asimov’s three laws of robotics [15]. We are giving over control to machine entities or, better still, objects and units outside of ourselves.

In fact, many argue that we have already lost great chunks of our autonomy without the expected commensurate increase in security. Will the natural instincts and creative inputs of human beings become increasingly redundant in a world where the tin man has the final say?

Acknowledgment

This article has been adapted from "War Robots and the 2014 World Cup—Defenders Off the Field," published in The Conversation on 4 March 2014: https://theconversation.com/war-robots-and-the-2014-world-cup-defenders-off-the-field-23770.

References

[2] T. Mogg. (2013, May 17). PackBot military robots to help with security at 2014 FIFA World Cup in Brazil. Digital Trends. [Online]. Available: http://www.digitaltrends.com/cool-tech/irobot-militarybots-to-patrol-2014-world-cup-in-brazil/