Abstract

Spurred by Race to the Top, efforts to improve teacher evaluation systems have given states an opportunity to get teacher evaluation right. Although a core reform area of Race to the Top was the use of teacher evaluation to provide ongoing, meaningful feedback for instructional decision making, we still know relatively little about how states’ responses in this area have changed teachers’ use of evaluation data for instructional improvement. Self-determination theory (SDT) and the concept of functional significance were used as a lens for understanding and explaining patterns of use (or non-use) of Compass-generated evaluation data by teachers over a period of 3 years in a diverse sample of Louisiana elementary schools. The analysis revealed that most teachers exhibited either controlled or amotivated functional orientations to Compass-generated information, which resulted in low or superficial use of that information for improvement. Teachers’ perceptions of the validity and utility of evaluation data were critical, and multifaceted, determinants of use: some teachers worried that state and district assessments would harm vulnerable students, while others questioned the credibility or fairness of the feedback. These perceptions were compounded by (a) evaluators’ lack of experience evaluating teachers in more specialized roles, such as special education teachers; (b) a lack of support in terms of training on Compass and its processes; and (c) a lack of teacher autonomy in selecting appropriate assessments and growth targets for Student Learning Targets.

Teacher Interview Protocols

Data Collection Wave I

1. TELL ME ABOUT YOURSELF: Could you describe your background in education and your current responsibilities? Probes: Years of experience teaching, certification, grade level(s) taught, content area expertise, outside school-related involvement, school committees.

2. WHAT HAS IT BEEN LIKE TRANSITIONING TO THE CCSS AT YOUR SITE? Tell me a little bit about the transition of your school/district to the Common Core State Standards (CCSS). What has been good about the transition? Where have the challenges been? Probes: Opinions/orientations to the idea of CCSS (centralization of curricula across states); impact on students; developmentally appropriate practices; training/preparation/PD.

3. HOW PREPARED ARE YOU TO IMPLEMENT THE CCSS? What is the aspect of the CCSS that you feel most equipped to deal with/handle in the classroom? The least? Do you feel as though there is a system in place to support you in addressing this area of concern? Probe the nature of their feelings of support more deeply if necessary.

4. HOW DO YOU FEEL ABOUT COMPASS? Moving from CCSS to Compass, the new teacher evaluation system, what have been your initial impressions of this evaluation tool? (Reword: What do you see as the benefits of this evaluation system? What are the drawbacks of such an evaluation system in your opinion?) Probes: Student Learning Targets; Value-Added Measures; observation rubric and scaling (modified Danielson rubric); training and support.

5. Can you give an example of when you feel the most control over your teaching in terms of who you are as an individual and a professional? Can you give an example of when you feel the least control over your teaching?

6. Since the beginning of these two initiatives, have you ever found yourself questioning your identity either professionally or personally? Would you mind describing a good example of one of these instances?

Data Collection Wave II

1. (Overall Background) Tell me a little bit about how this school year is going for you right now. What are some successes? Some challenges? (Verify same teaching position as last year/same responsibilities. Probe any new responsibilities.)

2. (District/School Context) Now that the Common Core State Standards are being fully implemented across the state, what has your school/district done differently this year to incorporate these standards into your curriculum? (Probes: adopting an already-developed system (e.g., EngageNY) or developing your own.)

3. (CCSS Feelings activity) Have your feelings toward the CCSS changed now that you have had some time to work with the standards? How would you describe your feelings about the CCSS using an image or a metaphor? Would you mind drawing or writing your feelings for me? (Give them a separate sheet of paper with space to draw/write.)

4. (CCSS Support activity) Directions: Provide the participant with the Support activity instrument. In the center of the circle, the participant is to write “self” or “teacher.” In each of the boxes with arrows pointing toward the center, participants are to write one type of support they are receiving in implementing the CCSS. In boxes with arrows pointing away from the circle, participants are to write their feelings about the support they are receiving. Finally, the interviewer is to ask the teacher to fill in the blank/answer the question “What do you feel is missing in this picture of support for CCSS?”

5. (Compass Follow-up) Can you tell me about your initial reaction to your overall Compass score? How are you dealing with your rating personally? How well do you feel your Compass evaluation and SLT/VAM score reflect your teaching? Will you share your overall rating with me? (Probes: Student Learning Targets; Value-Added Measures; observation rubric and scaling (modified Danielson rubric); training and support. Collect their Compass evaluation score from last year (ineffective, effective emerging, etc.), both overall and the subscores, if the teacher is willing to share.)

Data Collection Wave III

1. (Overall Background) Tell me a little bit about how this school year is going for you right now. What are some successes? Some challenges? (Verify same teaching position as last year/same responsibilities. Probe any new responsibilities.)

2. (District/School Context) How has your district/school responded to the PARCC test from a curriculum standpoint? (Probes: Are new programs being implemented to better align with PARCC, and/or are curriculum materials the same as last year?) Do you feel the conversation this year in your district/school has moved toward PARCC preparation or continued with curriculum alignment to the CCSS?

3. (CCSS Feelings) Last time we met, we discussed your feelings about the CCSS. Let us look at what you said then (present written metaphor activity to teacher and use the following as guiding questions for unpacking the image). Are you still feeling this way? (Probes: Has the easing of VAM-based accountability changed your attitude toward the standards this year or the way you are teaching? Has the PARCC test added to or taken away from your feelings (positive or negative) toward the CCSS?)

4. (CCSS Support activity) Last time we met, we also discussed the type of support you are getting from your school/district. (Present prior written support activity image to teacher, and use the following as guiding questions for unpacking the image.) How would you modify this image to add or take away support items? Have you received more or less support this year? Has the focus of the support changed as a result of the new testing guidelines?

5. (Compass Follow-up) Would you discuss with me your informal evaluation from the fall? If you would, share with me an example of how you have used the information from your Compass evaluation last year to prepare for your evaluation/teaching this year. (Probes: feelings about using data for improvement; training/support for improvement; collaboration around using data.) (Also collect their Compass evaluation score from last year (ineffective, effective emerging, etc.), both overall and the subscores, if the teacher is willing to share.)

6. (SLT Follow-up) As SLTs have been emphasized at the state level this year, would you talk a little bit about your experience with them now? How has this increased focus shaped your preparation and teaching this year? (Probes: Have your feelings toward your SLTs (if applicable) changed? In what ways?; professional development/support; follow-up on results.)
