Reflections on my adventure through life

Technology Education and Social Justice

The following represents my current thinking on the need for a revolutionary shift in the way we approach technology education at all levels if we are to achieve the social justice outcomes of social change and transformative action. Please take a few minutes to read through it and to provide comments to help us think further through these challenging issues.

Technology education at all levels is inherently political. Feminist writing has strongly influenced the “social shaping of technology” literature within the field of science and technology studies, which argues that technology both shapes and is shaped by society. The feminist perspective on technology has worked to challenge our deterministic understanding of technology as a neutral artifact, separate from the social and developed by rational, independent technical imperatives (Wajcman, 2009). From the perspective of technological determinism, the independent evolution of technology leads to the immutable molding of society to fit its patterns and efficiencies through a natural selection of social processes that integrate the technology (Ellul, 1954; MacKenzie and Wajcman, 1999; Winner, 1986). Technological determinism is closely related to technocentrism, an unwavering faith in, and focus on, technology as the means to resolve social problems. Papert (1987) likens technocentrism to Piaget’s egocentric stage of child development, whereby centrality is given to a technical object in the same way a child has difficulty understanding anything independent of self. These concepts were brought together with individualism, anti-authoritarianism, and neoliberal capitalism in a “Magna Carta for the Knowledge Age” (Dyson, Gilder, Keyworth, and Toffler, 1994), distributed by the political think tank The Progress and Freedom Foundation. The Magna Carta was subsequently formalized in the conceptual framework of cyberlibertarianism (Winner, 1997). Kincheloe and McLaren (2009) state that knowledge of the world is socially constructed within specific historical and social contexts that are fundamentally mediated by power relations. Facts are always determined by some degree of ideological inscription.

Cyberlibertarianism and its underlying foundational frameworks can be understood, then, as a pivotal factor mediating political relations by becoming a core design inspiration for, and a means of reifying neoliberal influences and power relations through, our technologies (Golumbia, 2013). “Where the system of oppression has become institutionalized it is unnecessary for the people to be oppressive” (Kennedy, 1970). Technology education that does not challenge embedded values within technology serves to reinforce those values – it is political either way (An, 2008).

A sociotechnical systems approach brings the social and technical into relationship, emphasizing a connectedness in all phases of the technology lifecycle (Wajcman, 2009; Whitworth, 2009). As such, we shift from thinking about technology as a thing to thinking about technology as a social process (Rhinesmith and Wolske, 2014). For example, Whitworth (2009) explained, “sociotechnical systems are systems of people communicating with people that arise through interactions mediated by technology rather than the natural world” (p. 395). While a focus on technical systems seeks to achieve predictability and control, a focus on sociotechnical systems calls for approaches that identify emergent changes and behavior (Fisher and Herrmann, 2014). Users are seen as co-creators or innovators-in-use, appropriating technologies to fit local contexts, values, and goals (Eglash, 2004; Bruce, Rubin, and An, 2009).

While a sociotechnical systems framework is an important, necessary step in challenging the dominant cyberlibertarian narrative, it is not sufficient. A critical lens complements a sociotechnical systems approach by exposing ways in which technology artifacts are socially constructed, intentionally and unintentionally, to reinforce exploitation, marginalization, and cultural imperialism (Eubanks, 2011; MacKenzie and Wajcman, 1999; Wajcman, 2009; Winner, 1986). Critical awareness of the relationship between the social and the technical opens up selection of technical systems that more closely align with personal and community epistemology and ethics (Eubanks, 2011; Zheng and Stahl, 2011). By advancing agency and challenging the exclusive role of the expert professional within the technology lifecycle, it also opens up opportunities to critically consider ways in which the roles of women and minorities in technology have historically been hidden or excluded, even challenging the definition of technology as only that created by engineers and computer scientists (Sinclair, 2004; Wajcman, 2009; Eubanks, 2011).

Technology education, whether in a pre-professional Library and Information Science (LIS) program or as part of a digital literacy program within an LIS organization such as a library, will reinforce the dominant narrative unless it intentionally and consistently challenges the idea that technology is neutral and separate from social influences. An isolated session or occasional reflection question incorporated into an otherwise technically oriented course, even when incorporating solid progressive and service-learning pedagogy, is inadequate (An, 2008). Consideration of the essential building blocks of technology must be complemented with critical consideration of why those building blocks might be assembled in the way they are, given the cultural, philosophical, political, and economic influences that underlie the design, production, distribution, acceptable use policies, and end-of-life considerations for a given artifact.

In his introduction to sociotechnical systems, Whitworth (2009) describes the evolution in the effective design of innovations that began when engineers and computer scientists entered into deeper collaborations, and continued when behavioral scientists were brought into the conversation. But importantly, he points to the sociotechnical gap that exists “between what computers do and what society wants” (p. 395). A critical sociotechnical systems approach to technology education uniquely prepares people to work within this sociotechnical gap. As libraries increasingly host creative activities such as Makerspaces and Fab Labs, a critical sociotechnical systems approach helps everyday innovators to transform their communities as part of social justice programs. But such a pedagogical approach has value not only for people working in traditional libraries. As corporations move from shareholder capitalism to stakeholder capitalism – thereby advancing the interests of multiple stakeholders including consumers, employees, suppliers, investors, society, and the environment (Mackey, 2011) – people filling the sociotechnical gap by employing a critical sociotechnical systems approach can have a major social justice impact in the corporate realm as well.

References

An, J. (2008). Service learning in postsecondary technology education: Educational promises and challenges in student values development (Doctoral dissertation). University of Illinois at Urbana-Champaign.

Dyson, E., Gilder, G., Keyworth, G., & Toffler, A. (1994). Cyberspace and the American dream: A Magna Carta for the Knowledge Age. The Progress and Freedom Foundation. Retrieved November 4, 2014, from http://www.pff.org/issues-pubs/futureinsights/fi1.2magnacarta.html

MacKenzie, D., & Wajcman, J. (1999). Introductory essay: The social shaping of technology. In The social shaping of technology (2nd ed., pp. 1–49). Buckingham, UK: Open University Press. ISBN 9780335199136

7 thoughts on “Technology Education and Social Justice”

I like this point: “Technology education that does not challenge embedded values within technology serves to reinforce those values – it is political either way (An, 2008).” So then, how do we make people (particularly teachers) aware of the embedded values and the effect they have on our society? In short, how do we make people care? Challenging embedded values is very difficult, so it is easy to just ignore them, thereby perpetuating them. In my experience, people are afraid of things that challenge their worldview. So it is hard, as a feminist and technology teacher, to tell someone that the way they see the world, or specifically, technology’s role in the world, is wrong. This creates a hostile environment for teaching, so how do we challenge values in a way that is not threatening? This is a challenge that is not unique to teaching technology, but applies to social justice in general.

Perhaps a possible solution is not to go around telling people their worldview is wrong, but rather that it could be improved. I’ve just had the experience of telling someone that they had just said something racist or sexist, and their response was “So?” How do we make people care about making this world a more just place? Then again, that person benefits from our society’s unequal structure, so I guess they wouldn’t be invested in creating a society where their privilege might be lessened.

The only solution so far to the “So?” comment is to bring it down to the personal and individual level. Racist and sexist comments hurt me and my feelings. Maybe I can’t make someone care about our society, but maybe I can build a relationship to the point where the harms of sexism (and many other “isms”) are real and important because they harm me as a real person who might be important to the offender. I think there might be a connection here to Paulo Freire, in that it is important for the oppressed to educate their oppressors (it’s been a while since I read his Pedagogy of the Oppressed).

My connection to the above discussion of teaching technology is a belief in a bottom-up approach. My goal as a technology teacher is to break down technocentrism by teaching a critical approach to technology. If I can foster a community of technology users who challenge the way things are used, designed, or created, and who have ideas about what can be done better, maybe those ideas can be put into action small-scale in Fab Labs or Makerspaces. Maybe these ideas and prototypes will work their way up and change the way software developers, computer engineers, and computer scientists think about and create technology. Or maybe the ideas should stay small, and everyone could become hackers and individualize technology to suit their needs.