This past semester, I decided to take a MOOC (Massive Open Online Course) from Indiana University on Information Visualization which, besides offering great coverage of current visualization conventions in academia and publishing, also featured a module on their proprietary software Sci2.

I think it was a great opportunity both to learn the material and to experience a team project conducted at a distance. My group worked on building a visualization of the history of psychoanalysis.

Working Paper from Media and Memory, taught by Dr. Katya Haskins, Spring 2014.

Abstract: Public memory sites, like all civic institutions, are governed by asymmetrical power relations: the public learns and experiences the vision of those in control of memorials and museums. Critical scholars and advocates, however, seek to re-balance knowledge relationships using emancipatory techniques and technologies that support and value the voices of previously silenced people. Mammoth Cave National Park is a public memory site where a team built a participatory GIS to collect the stories and memories of families evicted from their ancestors’ lands in the 1930s. This paper uses the Mammoth Cave Historical GIS project as a case study to explore the efficacy of GIS as a memory platform. Relying on theory from rhetorical analysis, memory studies, and the digital humanities, this case explores the rhetoric, politics, and ethics of using participatory GIS for critical memory work. Ultimately, several provocations emerge which challenge the ideal of GIS as an inherently emancipatory technology. Despite the goals and rhetoric of critical geographers, participatory GIS still produces a master narrative as a result of its implementation cost, technical form, and aura of objectivity.

Abstract: “Big data” can never speak for itself; large data sets are constantly interpreted during the collection and research process. This case study illustrates the utility of data visualization techniques for “reading” data sets to produce appropriate data stories. By remaining aware of the interpretive nature of computational methods, researchers can use visualization tools to find structural differences in data sets that result from different collection methods, in this case “gamification.” Using two data sets from the research site GiveALink.org, these differences are enumerated as part of describing the interpretive data visualization process.

In “How surveillance changes behavior,” Steve Lohr reports on NCR Corporation’s Restaurant Guard and the research report “Cleaning house: The impact of information technology monitoring on employee theft and productivity” by Lamar Pierce, Daniel Snow, and Andrew McAfee. Lohr opens by summarizing current news coverage of surveillance in America, with issues ranging from the Edward Snowden NSA scandal to the mayoral debates in New York City. He describes these stories as coalescing around competing narratives of reassurance or invasive technology. Lohr then points out that few news stories address how technological surveillance may actually change behavior, not simply record it. While behavioral changes caused by recording technology have been discussed previously in the academic literature, this research report is the first to take a large data set and quantify that change as a monetary figure, something that restaurant owners and corporate investors are invested (literally) in knowing.

The experiment works as follows: the NCR software was installed at 392 restaurants from five common casual-dining chains (restaurants on the same tier as Chili’s) in 39 American states over a period of several months, and the records were kept for two years. The software monitors sales transactions to uncover employee-level theft from two primary sources: staff incorrectly overcharging customers, or staff “comping” meals and keeping the difference when customers paid in cash. Managers or owners receive a notification only if a transaction appears undeniably suspicious, a “clear case of misconduct.” Lohr explains that these small-scale transactions are actually a huge problem, amounting to at least 1% of restaurant revenue in a business that averages only a 2 to 5% profit margin annually. At a 5% margin, that 1% of revenue equals 20% of profit, and losing a fifth of projected profits is enough to put many restaurants out of business.
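Lohr’s profit arithmetic can be sketched in a few lines. The revenue figure below is hypothetical; only the 1% theft rate and the 2 to 5% margin range come from the article.

```python
# Back-of-the-envelope arithmetic from Lohr's article: theft equal to
# ~1% of revenue, measured against a 2-5% annual profit margin.
revenue = 1_000_000          # hypothetical annual revenue, USD
theft = 0.01 * revenue       # at least 1% of revenue lost to theft

for margin in (0.02, 0.05):
    profit = margin * revenue
    share = theft / profit   # fraction of projected profit wiped out
    print(f"margin {margin:.0%}: theft erases {share:.0%} of profit")
```

At a 5% margin the theft consumes a fifth of profit; at a 2% margin it consumes half, which is why these seemingly small transactions add up to an existential problem for restaurants.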

Abstract: Video games are an example of a tethered appliance. Starting with the latest generation of consoles (Xbox One and PlayStation 4), service providers can make minute, remote changes to the gaming platform while simultaneously collecting millions of pieces of user data. This “tethered” aspect of the new consoles sparked massive debates among American gamers concerned about their privacy and consumer rights, and it led to some concessions, such as Microsoft’s removal of the Xbox One’s “always on” requirement. After a careful consideration of the video games industry and video game culture, this paper argues that big data analytics are changing how consumers interact with service providers, creating a set of potential problems. The industry’s use of big data analytics reveals patterns that affect society more broadly. Big data can contribute positively to gaming by inspiring new opportunities for player agency, expanding game mechanics, and allowing novel narrative moments. Yet games are also a place where big data’s flaws are exposed, from abuses in player profiling and violations of privacy to unequal access to information. By uncovering and discussing these emerging problems, consumers and users can monitor the implementation of future analytic technologies to protect their interests in the political struggle over data and information.

Numerical power, typically understood as strength of certainty in statistics or as a functional property in mathematical contexts, takes on alternative semiotic meanings when applied discursively in contested, lived spaces. This paper discusses the political instrumentality of numerical descriptors for particular types of projects in cultural studies, a discipline with a decidedly political foundation dedicated to uncovering and revealing naturalized power structures. Beginning with a discussion of why quantitative methods have historically been ostracized from cultural studies, a traditionally qualitative discipline, the paper then addresses three ways data analysis can benefit the field: the contribution of magnitude, the construction of models as metaphor, and the selection of dialogic sites for critical qualitative intervention. To demonstrate this tripartite schema, which complements various methodologies, this paper reimagines several canonical pieces from political economy (Gibson), psychoanalysis (Lacan), ethnography (Ma and Cheng; Geertz), and digital technologies (Deuze). Ultimately, these roles can expand the rhetorical force and dialectic appeal of some projects, but the approach is not for every project; all three roles are situationally contingent.

Working Paper from New Media Theory, taught by Dr. Jim Zappen, Fall 2013.

Abstract: The Pope is on Twitter! The President is on Reddit! What happens for a reader when traditional, powerful people and institutions engage with online communities in their vernacular social spaces? This paper expands on previous research (Lanius, 2011, “YouTube Commentary”), which addressed speech community formation across YouTube, by taking descriptive parameters from that piece and demonstrating their implications for community speech-mode formation today. With interactive features saturating the web, certain phenomena are observable as fairly stable processes of socialization and enculturation that bound the inside, the outside, and the politics of online communities. This project asks: what happens when two sets of expectations and communicative practices collide? Online communication forums, including commentary threads, Twitter feeds, and blogs, have demonstrated their capacity to form groups of like-minded individuals who coalesce around common causes or ideologies. Limited research has addressed how ‘foreign’ texts are inserted into these spheres, frequently presenting expectation-bending communiques that must maintain their professional veneer while simultaneously addressing an audience contrived and imagined from the website’s speech community. In these moments, what are the optimum strategies for readers to decode the message? Through a content analysis of a Reddit AMA with President Barack Obama and the Twitter feed of Pope Francis, this analysis reveals the conflicted and argumentative conversations that unfold beneath authoritative texts, where speech community members debate and decide how they should receive the message. Such an analysis enables scholars and communication specialists to re-theorize media reception in disparate environments and to prepare for the composition and construction of future communication events involving traditional authority and internet communities.