Monday, December 3, 2012

I am more and more convinced that we live in a world of make-believe, a world of make-believe created by the media we use.

Six years ago I started a personal inquiry into why, every time I built an art program at a school, it was either cut or I was bumped out within two or three years. This journey led me to spend five years as a technology integration specialist, to study school policy, and to research intensely the history of public schooling, particularly the history of trends in instructional methods and pedagogy. What I found was quite troubling. In the past ten years Minnesota has lost over 50% of its fine arts FTEs in schools across the state. How can this be? Could there be other data and statistics out there that point to the source of this problem? I will come back to this point later.

Last summer it struck me. I have attended and presented at the ISTE Conference twice in the last three years and have followed it closely the years I could not attend. This is the world's largest education technology conference. Now, when I was growing up, the computer was a major influence on my intellectual development. I spent hours learning how to program video games and write code. Through writing code I learned algebra, logic, mechanics, and physics in an intensely immersive, project-based, authentic way. For me the most important aspect of this machine as a learning tool was what it provided as a medium of expression, of creativity, and of engagement. It brought math to life. It also provided me a way to cope with my dysgraphia. But of the 385 concurrent sessions offered at the 2012 ISTE Conference, only four were about engaging students as computer programmers. FOUR! That is just 1% of the sessions at the world's largest education technology conference. What is going on here? For fun, and to illustrate a point, I created this infographic about last summer's ISTE Conference:
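The kind of learning described above can be sketched in a few lines. This is an illustrative toy, not code from the post: a minimal game-physics loop in which updating a jumping character's position each frame uses exactly the kinematics (velocity, acceleration, time steps) a student would otherwise meet only as abstract algebra.

```python
# A toy game-physics loop: a character jumps, and each simulated
# frame applies the same kinematics taught in a physics class.

GRAVITY = -9.8      # m/s^2, pulls the character back down
TIME_STEP = 0.1     # seconds of simulated time per frame

def simulate_jump(initial_velocity):
    """Return the character's height at each frame until it lands."""
    height, velocity = 0.0, initial_velocity
    frames = []
    while True:
        velocity += GRAVITY * TIME_STEP   # acceleration changes velocity
        height += velocity * TIME_STEP    # velocity changes position
        if height <= 0:
            break
        frames.append(round(height, 2))
    return frames

print(simulate_jump(5.0))  # heights rise, peak, then fall back down
```

A student tweaking `GRAVITY` or `TIME_STEP` to make a jump "feel right" is doing applied physics, which is the immersive, authentic engagement the post describes.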

One percent of ISTE sessions about programming, and 50% of Minnesota's art teachers lost: could these statistics have a common root? I believe so. What I left out of this infographic is that when you search the ISTE conference site using the session keyword search, sessions with the word "create" far outnumber any of the examples I listed here. However, upon closer inspection, these sessions are all examples of creation as a consumer of participatory media. When we create a web page using a WYSIWYG editor, when we build a Facebook page, when we upload photos to Flickr, when we make a podcast, when we click "like" on someone's status, when we click a link, we are feeding the beast. All of these activities produce data.

In Understanding Media: The Extensions of Man, Marshall McLuhan (1964) tells us that "the medium is the message," meaning that the content a medium carries matters less than the effect the medium itself has on us. In other words, the effect of the existence of television in our homes is greater than the effect of any content that may be delivered through it. A medium works us over, numbing us to its nature and shifting our ways of thinking. McLuhan wrote this book at a time when television was beginning to take over as the dominant medium.

Twenty years later Neil Postman wrote Amusing Ourselves to Death, at a time when television had long since replaced newsprint and radio as the dominant medium and McLuhan's observations had become more apparent. Postman observed that when a society shifts its dominant medium from one to another, it also shifts what it values as a source of truth. Once “feeling is believing,” then “saying is believing,” then “seeing is believing,” then “reading is believing,” then “deducing is believing,” and now “counting is believing.” Postman argues that it is our media-driven culture that has reduced our concept of believable data to that which can be counted, that which can be objectified and abstracted. I would argue that it was not the mass media of television or radio that did this (these are analog media) but rather the emergence of the computer and digital media.

If you look at rhetorical arguments, campaign propaganda, and advertisements from the 1950s, you see a different appeal than you do today. Not a day goes by in which we are not inundated with data and statistics meant to ground our beliefs, change our minds, or influence our actions, but just 60 years ago the use of data and statistics was far less a part of our lives. Instead, the analog electric media of the time produced a belief system that relied on analog sources. A politician was more likely to use a "plain folks" argument or a clever play on words to sway a voter, and advertisements for products focused more on how a product would make you feel.

But today it is data that we look to as our source of truth. The role of data and data visualization has changed and evolved over time. We have become a data-obsessed culture, to the point where if we make decisions that are not data-driven or data-informed we are seen as foolish. In a staff meeting just a couple of months ago a colleague said, "How can you justify a decision like that without collecting data to back it up?" A statement like this is telling on two fronts. First, it negates qualitative data and focuses only on that which can be quantified. Second, it implies that the decision has already been made and that the data we collect ought to back it up. This is not data-driven decision making; it is decision-driven data collection. More often than not, what is called data-driven decision making is really a rhetorical device used to justify decisions based on other factors.

The building blocks of digital media are data: digits, 1s and 0s. Having moved to center stage in the past ten years as the dominant medium, digital media has shifted publicly accepted sources of truth toward that which can be quantified. If something can't be counted, it is hard to justify, and it is invisible to the meta-world that the data we produce overlays on our real world. Along with this new source of truth comes a charge and a desire to "feed the beast," to produce more and more data, because the more clearly the meta-data world represents aspects of our real world, the easier it is to manipulate both. Hence the presence of so many sessions at ISTE asking teachers and students to "create" but so few asking them to "program." And because there is no data-collection method used to evaluate the effectiveness of Minnesota's arts programs, many of them get left out of the data-driven decision process. How can you make a decision to keep a program when the law of the land (NCLB) asks you to make decisions based upon the data you have collected, data that is easily quantifiable? But make a standardized test to evaluate student achievement in the fine arts and you kill it.
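The claim that digital media bottoms out in 1s and 0s is literal, and a tiny sketch (illustrative only, not from the post) makes the point: anything that enters the data layer must first be reduced to bits, and anything that can't be encoded this way never enters it at all.

```python
# Everything digital reduces to bits before any system can store,
# count, or aggregate it. Here a word becomes the binary digits a
# computer actually holds; whatever resists this encoding stays
# outside the meta-world entirely.

def to_bits(text):
    """Encode text as the binary digits a computer stores."""
    return " ".join(format(byte, "08b") for byte in text.encode("utf-8"))

print(to_bits("art"))  # → 01100001 01110010 01110100
```

The word survives the encoding; the experience of making the art does not, which is exactly the gap the argument turns on.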

I say we are living in a make-believe world. The data-equals-truth pandemic is a grand illusion. This world overlays our real world, and we tap into it all the time. Today, with a smartphone and an augmented reality app, you can scan your neighborhood and access data about physical places, and soon you will be able to access data about people. QR codes give us access to the meta-data world associated with objects around us. There have even been calls to create a game layer over the real world. These tools all give us more information that is supposed to help us make decisions, but what do they leave out? Are there things in our real world that cannot be digitized or datafied? Are there things that will not make it into the meta-world? And when we rely on the meta-world as our source of truth, what happens to those things that don't make it there?

The big problem is not that data cannot represent aspects of truth. It can. But it can never represent all aspects of truth. Data is by nature an abstraction, and making sense of massive amounts of it requires visualization techniques. A data visualization is a further abstraction, one step more removed from the truth. The more we work with abstraction, the more we can manipulate the interpretation of truth. Truth in data becomes truth crafted by interpretation of data, which becomes truth crafted by design. Nowhere is this more evident than in the emergence of infographics.

If I gave you a statistic that said homeowners increased their spending on entertainment 11% between 1986 and 2010, you probably wouldn't pay it much mind. But if I accompanied that statistic with a picture of an overweight guy in a stained tank top slumping on a couch, watching television and eating potato chips, you would interpret that data infusing it with negative associations. However, if I instead accompanied that data with an image representing ballet, the theater, or the symphony, it takes on a new interpretation. The interpretation of data can be manipulated without changing the data.

Marshall McLuhan was cautiously optimistic that if we understand the nature of media we can avoid many of the negative effects it brings. If we understand the nature of the beast, we can keep it at bay. At the same time he observed that "we shape our tools, and thereafter our tools shape us," leaving open the possibility that no matter how well we understand the nature of the media, it will likely have its effect anyway.

11 comments:

I agree with your argument, but I think that "creation as a consumer of participatory media" needs more explanation as an idea. If you could elaborate more in this area, then you would have better buy in from a larger group on the basic idea in this article.

Unfortunately, this is really one of the challenges of the digital age. How can we move from scripted creation in a highly restrictive environment to more free-range creation? How can we move from constructing IKEA furniture to carpentry?

I understand what you're saying, and I surely agree that more folks should know how to code. That said, it's only when the tools become available in easier form that the true flourishing of creativity becomes possible, because then average folks - not just specialized experts - can utilize them without steep learning curves.

HTML, blogs, and Facebook are a prime example of this. When we tried to advocate to people the power of having a site on the Web, few took us up on that proposition when it required learning HTML. More took us up on that when it became easier, in the form of blogs. And even more took us up on that when it became even easier, in the form of Facebook. We see similar parallels with other forms of digital creation.

Despite your concern about use of pre-designed tools, we are seeing an ever-increasing flourishing of user-generated content, conversation, and connections, all of which were less possible when the tools were more difficult.

Fair enough. But just because you had the interest to start digging into coding doesn't mean most people do. In fact, we know they don't. And yet they can be quite creative with tools others design and offer. For example, you're not using blogging software you created from scratch. Nor do you likely use word processing software, presentationware, electronic spreadsheets, email programs, and other tools that you made yourself. And yet you use those tools that were made by others in creative and powerful ways. Doesn't your daily practice disprove your own point?

So because I use tools someone else made to write and publish my blog, it negates the startling statistic that only about 1% of the sessions at ISTE last year were about teaching students to program? Plus, my point is not that we should only use tools we make from scratch, but only that computational thinking is important enough to develop in students that it should not be on the endangered species list. And I do often make my own tools.

I'm curious to hear from either of you (Scott or Carl) on how Jonassen's Mindtools would fit into this discussion. There are obviously advantages to being able to create something on your own, it's just a balance between what you're asking others to do, and what they're willing to do (i.e. learn how to code).

There are also implications on design model thinking here. Can teachers (and students) be designers without knowing how the programs they're using internally work?

After some reflection on Scott's comment yesterday, I don't think I ever made the case that tools made by others limit creativity. They do place limits on expression, which is true of any media, digital or analog. What I was trying to point out was that with web 2.0 this limit of expression is different because it produces data that can be accessed by the tool maker and most web 2.0 tools integrate data collection and aggregation as a core feature. In many cases this has profound positive advantages but it still fuels the belief that data is the source of truth.

I am only slightly familiar with Jonassen's Mindtools. My understanding is that he basically sees the use of technology in a learning environment as a cognitive extension. I think this theory is in perfect alignment with Marshall McLuhan's belief that all technology and all media are extensions of people. Tons of others support this viewpoint as well.

Also, I agree with Scott in that you certainly can design without knowing how all the nuts and bolts work. In fact, you could argue that the web designer who writes all his work in HTML didn't invent the language or the browser that is used to render the code, nor did he invent the assembly language that the browser's own code gets translated into, or the machine on which it runs.
