PUBLICATIONS

Dedoose has been field-tested and journal-proven by leading academic institutions and market researchers worldwide. Thousands of prominent researchers across the US and abroad have benefited from early versions of Dedoose in their qualitative and mixed methods work and have laid an outstanding publication and report trail along the way.

Students' Perceptions of Characteristics of Effective College Teachers: A Validity Study of a Teaching Evaluation Form Using a Mixed Methods Analysis

This study used a multistage mixed-methods analysis to assess the content-related validity (i.e., item validity, sampling validity) and construct-related validity (i.e., substantive validity, structural validity, outcome validity, generalizability) of a teaching evaluation form (TEF) by examining students’ perceptions of characteristics of effective college teachers. Participants were 912 undergraduate and graduate students (10.7% of student body) from various academic majors enrolled at a public university. A sequential mixed-methods analysis led to the development of the CARE-RESPECTED Model of Teaching Evaluation, which represented characteristics that students considered to reflect effective college teaching—comprising four meta-themes (communicator, advocate, responsible, empowering) and nine themes (responsive, enthusiast, student centered, professional, expert, connector, transmitter, ethical, and director). Three of the most prevalent themes were not represented by any of the TEF items; also, endorsement of most themes varied by student attribute (e.g., gender, age), calling into question the content- and construct-related validity of the TEF scores.
See also Harris, Ingle, and Rutledge (2014), 'How Teacher Evaluation Methods Matter for Accountability: A Comparative Analysis of Teacher Effectiveness Ratings by Principals and Teacher Value-Added Measures.'
Abstract
Policymakers are revolutionizing teacher evaluation by attaching greater stakes to student test scores and observation-based teacher effectiveness measures, but relatively little is known about why they often differ so much. Quantitative analysis of thirty schools suggests that teacher value-added measures and informal principal evaluations are positively, but weakly, correlated. Qualitative analysis suggests that some principals give high value-added teachers low ratings because the teachers exert too little effort and are “lone wolves” who work in isolation and contribute little to the school community. The results suggest that the method of evaluation may not only affect which specific teachers are rewarded in the short term, but shape the qualities of teacher and teaching students experience in the long term.

Focus Groups

Written by a long-time authority on focus groups, this work presents a brief history of focus group application up to, and including, the variety of current uses across many disciplines. It offers a great section on the use of focus groups in combination with other methods, with a full compare/contrast discussion. Finally, it goes into the specifics of how to plan and conduct effective group data collection.

Quantitative and Qualitative Inquiry in Educational Research: Is There a Paradigmatic Difference Between Them?

Niglas, Katrin (1999)

Paper presented at the European Conference on Educational Research, Lahti, Finland, September 22-25

Discusses the distinctions between qualitative and quantitative methodological approaches in educational research. Seeks to compare and contrast the characteristics and assumptions of these approaches toward dispelling the notion of paradigm ‘wars’ and in the interest of improving the quality of research in education.

Great discussion and illustration of issues and strategy for establishing reliability in inter-rater coding. Intercoder reliability is a measure of agreement among multiple coders for how they apply codes to text data. Intercoder reliability can be used as a proxy for the validity of constructs that emerge from the data.
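As a concrete illustration of agreement among coders, the sketch below computes Cohen's kappa, one common chance-corrected intercoder reliability statistic for two coders. This is an illustrative example only, not code from any of the works cited here; the coder data are invented.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders over the same items."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: fraction of items both coders coded identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's marginal code frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)
    return (p_o - p_e) / (1 - p_e)

# Two coders applying a yes/no code to ten hypothetical excerpts.
a = ["yes", "yes", "no", "no", "yes", "no", "no", "yes", "no", "no"]
b = ["yes", "no", "no", "no", "yes", "no", "yes", "yes", "no", "no"]
print(round(cohens_kappa(a, b), 3))
```

Raw percent agreement here is 80%, but kappa discounts the agreement the two coders would reach by chance given how often each applies the code, which is why chance-corrected statistics are generally preferred over simple agreement reports.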

The use of qualitative data analysis software has been increasing in recent years. A number of qualitative researchers have raised questions concerning the effect of such software in the research process. Fears have been expressed that the use of the computer for qualitative analysis may interfere with the relationship between the researcher and the research process itself by distancing the researcher from both the data and the respondent. Others have suggested that the use of a quantitative tool, the computer, would lead to data dredging, quantification of results, and loss of the "art" of qualitative analysis. In this study of 12 qualitative researchers, including both faculty members and graduate students, we have found that these fears are exaggerated. Users of qualitative data analysis software in most cases use the computer as an organizational, time-saving tool and take special care to maintain close relationships with both the data and the respondents. It is an open question, however, whether or not the amount of time and effort saved by the computer enhance research creativity. The research findings are mixed in this area. At issue is the distinction between creativity and productivity when computer methods are used.
Computer packages targeted at qualitative and mixed methods research data are readily available and the methodology sections of research articles indicate that they are being utilised by some health researchers. The purpose of this article is to draw together concerns which have been expressed by researchers and critics and to place these within the perspective of 'framing' (MacLachlan & Reid, 1994). Here, the focus becomes the frame that these computer programs impose on qualitative data. Inevitably, all data sets are disturbed by the techniques of collection and the conceptual and theoretical frames imposed, but computer framing not only distorts physically but also imposes an often minimally acknowledged frame constructed by the metaphors and implicit ideology of the program. This frame is in opposition to most of the recent changes in qualitative data interpretation, which have emphasised context, thick description and exposure of the minimally disturbed voices of participants.

Scientific Foundations of Qualitative Research

Ragin, Charles C., Nagel, Joane, & White, Patricia (2004)

National Science Foundation Report

Report generated by an NSF workshop on qualitative research methods, organized into two major sections: general guidance for developing qualitative research projects and recommendations for strengthening qualitative research. The first section serves as a primer for both investigators developing qualitative proposals and reviewers evaluating qualitative research projects; the second presents workshop recommendations for designing, evaluating, supporting, and strengthening qualitative research.

Cultural Consensus as a Statistical Model

Romney, A. Kimball (1999)

Current Anthropology, 40 (Supplement), S103-S115.

Discusses history, theory, and strategy for the use of statistical models in the discovery of cultural consensus. Introduces issues related to data collection strategy and the use of empirical data to identify and represent cultural characteristics.

Journal of Mixed Methods Research

Bryman, Alan (2007)

Sage Publications

This article is concerned with the possibility that the development of mixed methods research is being hindered by the tendency that has been observed by some researchers for quantitative and qualitative findings either not to be integrated or to be integrated to only a limited extent. It examines findings from 20 interviews with U.K. social researchers, all of whom are practitioners of mixed methods research. From these interviews, a wide variety of possible barriers to integrating mixed methods findings are presented. The article goes on to suggest that more attention needs to be given to the writing of mixed methods articles.

Research in developmental and educational psychology has come to rely less on conventional psychometric tests and more on records of behavior made by human observers in natural and quasi-natural settings. Discusses reliability and generalizability in terms of coefficients that reflect the "quality" of data, what defines quality data, and how reports of agreement are insufficient.