What Makes Technology Creepy?

Growing up in the 1980s, I played with a lot of toys and technology. My first experience with a computer was an Apple IIe at school, where I played Where in the World is Carmen Sandiego. At home, my technology-enabled toys consisted of a Speak and Spell that would repeat what I typed, and video game consoles like the Atari 2600 and Nintendo. When I look back, I don’t recall ever thinking these were creepy experiences. The Speak and Spell spoke strangely, but I never thought of it as being alive. And because the Atari and Nintendo were not connected to the internet, online communication and safety were never real concerns with the video games I played (although I was told to sit farther away or my eyes would be damaged).

Instead, things that were creepy to me as a kid included haunted houses, the theme song from Jaws, the strangeness of E.T., the possibility of monsters, and stranger danger. Fast-forward to the 2010s: technology is much more pervasive, and new questions have arisen about social media, surveillance, privacy, dark patterns, and artificial intelligence.

Today, I’m the director of KidsTeam UW, an intergenerational co-design team of children (ages 7–11) and adults who partner to design new technologies FOR and WITH children. During my time with KidsTeam UW, my colleagues and I have noticed trends in how children think about toys and technology that seem different from when I was a kid. I’ve heard children say things like, “I don’t like that technology, it’s creepy” or “I can’t be with that technology, it’s creepy.” As my team realized that children connect the words “technology” and “creepy” often, we decided it was time to examine exactly what they mean when they say that.

Photo: KidsTeam UW

We asked the question: What makes children think certain technologies are creepy but view others as benign? To investigate this question, we conducted four participatory design sessions with 11 children (ages 7–11) to design and evaluate creepy technologies, followed by interviews with the same children.

What Do Children Fear?
We found that children’s fears about the risks of technology fell into two categories:

Physical Harm. Children in our study often used morbid words such as “kill,” “murder,” and “death” to convey their fears about creepy technologies, and they constantly referred to physical harm when describing creepy or unknown technologies. For example, children described technologies that try to punish their users and technologies used to stalk others and cause harm.

Loss of Attachment. A second recurring fear was that creepy technologies would take them away from their parents or otherwise intrude on relationships with people they love. One child we interviewed stated that creepy technologies take you away from mom and dad. Children feared technology “taking over your life,” leaving them unable to be with their real parents.

What Makes Children Think a Technology is Creepy?
Kids told us that certain signals make technology seem creepy and arouse fears of physical harm or separation from their parents. The five signals that came up repeatedly were:

Deception. The children in our study frequently expressed fears about technology intentionally deceiving them. For instance, one child noted: “Like I’ll say ‘call Jan Smith’ [mom, pseudonym] and it [digital voice assistant] will call that person. Okay, it will call them. Then when I ask ‘will you kill me in my sleep?’ it says ‘I can’t answer that.’” Here, the child wanted to hear a direct NO from the voice assistant, rather than an evasive “I can’t answer that.”

Mimicry. Children expressed concerns about technology mimicking them or other people, potentially giving it the power to subsume their identity. Children in our study worried that technology could “steal your identity,” replace you, or take you away from your family.

Control. Children expressed concerns about being unable to control the flow of information, the actions of a technology, and its output. For instance, one child explained that if they could not control Amazon Alexa, the technology would seem creepy: “Yeah, so, it’s like Alexa is in this room and she starts interrupting this conversation.”

Unpredictability. Children explained that systems whose behavior they could not predict—such as a digital assistant that no longer responds to its wake word—led them to worry that something that seems harmless might become sinister.

Ominous Physical Appearance. The superficial look, sound, and feel of a technology are key to how children assess whether it is creepy. In some instances, children were willing to look past other creepy signals if a technology had a charming appearance.

Mediating Between Signals and Fear
Children referred to their parents as the most important factor in determining whether technologies were creepy. One child noted that smartphones, laptops, and other consumer electronics were not creepy because their parents frequently used them without anxiety. In contrast, another child felt that consumer electronics had the potential to be creepy because his parents put a paper cover over their laptop camera to keep intruders out.

What Does This All Mean for Parents/Guardians and Designers?
Thinking back on my nostalgic stroll through the toys and technologies I had access to in the 1980s, compared with the tech-enabled gadgets children have access to today, it is clear that technology has changed so much that the way children engage and play with tech-toys is different. However, rather than being fearful or trying to avoid technologies and new internet-based toys, it could be very productive for parents to talk about these topics with their children.

Having a conversation about creepiness and technology may be an important first step into questions of safety, privacy, ethics, and surveillance. These are big topics for kids to digest, but we know from research that scaffolding conversations with children on issues of privacy and security is important.

We know from our work that children rely on their parents’ thoughts about technology as a way of making sense of the world. Kids we spoke to talked about their parents’ technology usage and said it was okay to trust common technologies, like smartphones and tablets, because their parents trusted them.

As a starting point for designing technologies that children trust and enabling children to decide what to trust, we came up with a set of questions that both parents/guardians and designers can think about and discuss together with kids, such as:

What information is okay for people to know about children? Do we think technology could deceive people? Why and how?

What do we want a technology to look like? Do technologies that look and feel nice always act nice?

Could technology ever replace people?

What do you think about parent/guardian usage of technology? How does parents’ technology usage affect children’s views of that relationship?

How should technology act with people?

How much control should kids have over technology?

Children today have core fears that technology can invoke but that are often overlooked in the design process. We believe it is essential to understand more deeply children’s fears of technology and how they make sense of the digital world around them.

Jason Yip, PhD

Dr. Yip is an assistant professor of digital youth at The Information School at the University of Washington, Seattle. His research focuses on how the design and implementation of new learning technologies can support participatory learning in science, technology, engineering, and mathematics (STEM) between children and families. He was a Cooney Center Fellow from 2013-2014.
Prior to his life as a researcher, Jason taught K-12 science and math for ten years. He completed both his undergraduate degree in chemistry and masters in science and math education at the University of Pennsylvania. He completed his doctorate in Curriculum and Instruction at the University of Maryland’s College of Education. He is also an affiliate of the Human-Computer Interaction Lab (HCIL) at Maryland.
Jason’s research focuses on partnering with children and families in the design of learning technologies and environments for STEM learning. At the HCIL, he was a member of Kidsteam, an intergenerational and interdisciplinary design team composed of children (ages 7-17) and adult researchers that designs new technologies. Through Kidsteam, Jason has collaborated with companies and non-profit organizations, such as Nickelodeon, the National Park Service, Google, and National Geographic, to develop new children’s technologies. For his dissertation, he studied the identity development and science ownership of children in an afterschool program called Kitchen Chemistry, in which children explore science through the development of their own personal food investigations.
Jason’s research has been published in many journals and conferences such as Interaction Design and Children (IDC), the International Conference of the Learning Sciences (ICLS), Computer Supported Collaborative Learning (CSCL), Journal of the American Society for Information Science and Technology (JASIST), and the International Conference on Human Factors in Computing Systems (CHI).