Survey researchers commonly assume that people "know what they do, know what they believe, and can report on it with candor and accuracy," as Angus Campbell put it. From this perspective, many findings suggest that survey respondents are less than candid. The best-known example is the observation that answers to racial attitude questions vary as a function of the interviewer's race. Challenging this interpretation, a large body of social psychological research shows similar context effects under conditions where deliberate misreporting is implausible, including studies that use implicit attitude measures, which are not subject to deliberate "faking."

From a situated cognition perspective, such findings reflect that attitude questions assess context-sensitive evaluations that respondents form on the spot, drawing on information that is accessible at that point in time. These processes operate in daily life as well as in survey interviews and reflect the situated nature of human judgment rather than a deliberate attempt to report a socially desirable answer.

I review relevant findings and discuss their implications for survey measurement.

I suspect that oftentimes the fear of big data is motivated by a concern that new, less tested, still-evolving methods will replace the time-tested methods we have grown to have so much faith in. I sincerely believe that the foundation we have is a strong one, and that the knowledge we have developed through those processes should be embraced, especially the quality controls. But SUPPLEMENTING an analysis through a measured combination of data sources can lead to a more complete picture.

This week I spent some time analyzing Pew's report on the Kony 2012 video. I believe that this report is an excellent example of what researchers are capable of when they look outside the artificial divisions of research group (this was a collaborative effort) and research methodology. Seven days after the release of the video, Pew was able to reconstruct a comprehensive narrative of the video's dissemination, using traditional survey methods, sentiment-analysis snapshots over time, and a careful breakdown of the media coverage of influential parties.

danah boyd also has an interesting analysis of the Kony phenomenon on her Apophenia blog:

This book looks fantastic. Whenever I need to do a lot of thinking at work, I'll go for a walk or hit the gym. Or start reading about a similar topic. Or stare out the window. We don't have Ping-Pong tables, but we do have floor-to-ceiling windows overlooking a wooded patch. I can't tell you how many cumulative hours I've spent watching the trees wave in the wind and working my way through a stumbling block.

I have a Zen calendar on my desk for 2012. It has such gems as: "Although the world is full of suffering, it is also full of the overcoming of it" (Helen Keller).

The more I look at the calendar, the more it relates to everything I think about.

I read "To see is to forget the name of the thing one sees" (Paul Valéry), and I think of the Charles Goodwin paper on Professional Vision that I cited in a recent post. He describes ways of seeing as kinds of coding structures, as inculturation, as ways of foregrounding certain parts of what we see. Seeing deeper than that requires shedding that inculturation and observing more closely. As researchers, we often become so deeply incultured into our way of thinking that we lose sight of our research goals. As survey researchers, we can easily fall into the pattern of first asking "who should we survey?" and "what should we ask?" before taking the time to consider whether a survey is even an appropriate methodology for the specific topic of focus. Of course, this habit of acting from praxis is not limited to survey researchers. Far from it! Every person, every field, every community of practice, every language has a way of thinking. And often, instead of seeing or observing, we quickly begin to navigate our networks of inculturation.

These two are similarly meaningful in my interpretation:

“Zen is not to confuse spirituality with thinking about God while one is peeling potatoes. Zen is just to peel the potatoes.” (Alan Watts)

“If all beings are Buddha, why all this striving?” (Dogen)

These are a reminder to boil things down to what they simply are and not to describe them as what you want them to be. In survey research, this comes up often in the process of reporting research results. If I know that I intended to measure something about project-based learning or STEM education, it is easy for me to begin to frame my findings by my intentions. But that is not true to my findings or my methodology, and it doesn't make for good research. I can't say that 10% of my respondents were using project-based learning methods in the classroom if I asked about the number of group activities they conducted. I must simply say that 10% were using group activities (daily/monthly/occasionally, whatever the answer choices were).

In this way, my Zen calendar not only provides something to think about in a larger sense but also keeps my research anchored.

This piece is a nice reminder not only, as the authors conclude, that sentiment analysis has not fully matured, but also that sentiment analysis and social media analysis probably don't accomplish what their practitioners think they are accomplishing: