HAL is in the house.

A colleague of mine and her kindergartners were busy exploring where an egg comes from. “Was it born like a baby? Does it grow on its own? Where do they come from?” Different perspectives and ideas were shared enthusiastically. The children discussed and challenged each other with their theories. At the end of the activity, one child turned to her partner and said, “When I get home, I’ll ask Siri for the answer.” A routine response in our classrooms? Or an important moment to understand that artificial intelligence (AI) has embedded itself in our day-to-day lives? For a generation of children who have been raised on iPads and Siri, AI – with a name and voice like a human – is as ubiquitous as any other technology.

AI is a tool that learns, anticipates, and predicts. It provides us with instantaneous information or completes routine tasks remotely. The Amazon Echo and Google Home, two devices that have recently gained traction, have begun entering homes as personal assistants. The Echo and Home are two of many voice-activated AI assistants that tap into vast artificial intelligence networks. They aggregate information based on our digital footprints and predict our habits using learning algorithms that engage continuously with the data we share on our digital devices.

A shift has occurred in our relationship with AI, and the impact is profound. It is the seamless adaptation of AI into our lives – a frictionless experience that is slowly making us dependent on this predictive technology. This new relationship caters to our unique tastes and needs, and only gets better the more it knows about us. Over time, this is changing the way our brain functions when interacting in the digital world. A short video by AcademicEarth.org, “Cognitive Offloading,” is a reminder of the neurological changes AI is bringing to our learning. We collectively feel more and more comfortable subcontracting tasks to AI; the phrase “let me Google this” is one example.

For educators, this shift is showing up in our classrooms informally and, in some instances, invisibly. Artificial intelligence is embedded in the devices that make up our school tool kits: mobile devices, apps, browsers, search engines, smartwatches, and more. Writer and professor Jason Ohler asks an important question in his article “Bio-Hacked Students on the Outer Edge of Digital Citizenship”: how should we, as educators, shift the curation of the scholastic experience when students come to the classroom with embedded or wearable artificial intelligences? This alters the value of knowledge as a commodity in the classroom and points to a potentially new hierarchy in which AI supplements a user’s expertise. Suddenly, we have 24/7 access to predictive and anticipatory information, which has the potential to disrupt the independent learning experience of a typical classroom. In his article “Artificial intelligence is the next giant leap in education,” Alex Wood reflects on the role AI could play in education.

These exponential changes take time to digest. As educators, we need to understand that engagement and critical thinking are vital components of education, especially as AI shifts the classroom narrative. The ethical issues surrounding these changes are here now. The complacency with which schools engage in the discourse about what it means to live in a world dominated by AI is a tension we cannot ignore.

As educators, we have a unique opportunity to design curricula around the narrative of artificial intelligence. We need to encourage our students to be not only good digital citizens but proactive digital leaders who understand the complexity of a world fueled by AI. Schools should promote the skills and inquiry mindsets that give students the capacity to harness the power and opportunities of AI rather than grow complacent with the technology. Ultimately, we want our students to be active leaders and architects of AI’s continued growth. We have a responsibility to ensure our students know how to navigate a complex and changing world fueled by artificial intelligence, for the good of future generations.

The reason I’ve done this is that in those specifications I see a fundamental distinction that is important to teach in any critical analysis of “smart” machines: are the machines collectors of information, or are they creators of it? Are they actually thinking?

The demos I’ve seen of Siri, Echo, and Home show that they are not much more powerful than existing text-based technologies that gather, store, and process information. They collate in a way that may prove more appealing, and, if we are not careful, they will infringe more upon privacy. While the facade of these machines provides a more human-like experience, children need to know that the answers they receive from these devices are compiled and derived from the activity of human beings, and that they are (for now) controlled by human agency. Understanding and contributing to so-called “AI” is a digital citizenship activity, as you say, but I disagree that what we are seeing is a move toward “AI,” even if it is a prompt to discuss what AI is or how it could potentially impact society.

I agree, as you say, that “ultimately, we want our students to be active leaders and architects of AI’s continued growth.” We have to start, I think, by making sure that students have an accurate picture of what it is and how they contribute to what it becomes, whether or not they are directly involved in its design. Presenting this as “AI” may start us off on the wrong foot.