
In Response to a Tennessee Assistant Principal’s Concerns – Part I: English/Language Arts

In our most recent post, an assistant principal from Tennessee expressed his concerns regarding the state’s value-added scores, as measured by the Tennessee Value-Added Assessment System (TVAAS; i.e., Tennessee’s iteration of the EVAAS model). He wrote about an apparent relationship he noticed between value-added scores and certain grade levels and subjects. For example, he called specific attention to the high value-added scores in 4th and 8th grade English/Language Arts (ELA) and the low scores in 6th and 7th grade ELA. Noticing a similar pattern across districts, he expressed his main concern:

Were these the results of a few schools you might assume it was a pedagogical issue. When you see this consistent pattern across thirty and forty schools it causes me [now as an Assistant Principal] concern to evaluate teachers with this tool.

After a quick skim of the data (which you can do here), my doctoral student, Jessica Holloway-Libell, and I were moved to conduct a more thorough analysis. We analyzed data from the 10 largest school districts in the state of Tennessee, based on population data (found here). We looked at grade-level value-added scores for 3rd through 8th grade ELA and mathematics (the mathematics analysis is forthcoming in another post). For each grade level and subject, we calculated the percentage of schools per district that had positive value-added scores for 2013 (see Table 1) and for their three-year composite scores, since the latter are often considered more reliable (see Table 2).

We wanted to know how likely each grade level was to receive a positive value-added score and whether there were any trends across districts. Not surprisingly, and thanks to the aforementioned assistant principal for bringing it to our attention, there were indeed some striking trends. For clarity, we color-coded the charts: green signifies grade levels for which 75% or more of a district’s schools received a positive value-added score, while red signifies grade levels for which 25% or fewer did.
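The tabulation described above can be sketched in a few lines of Python. The data and column layout here are purely hypothetical (actual TVAAS exports are formatted differently); the point is simply the per-district, per-grade percent-positive calculation and the 75%/25% flagging rule.

```python
# Minimal sketch of the tabulation: for each (district, grade), the percent
# of schools with a positive value-added score, flagged at the 75%/25% cutoffs.
# Records below are illustrative values only, not actual TVAAS data.
from collections import defaultdict

# (district, grade, school_value_added_score)
records = [
    ("Knox", 4, 1.2), ("Knox", 4, 0.3), ("Knox", 4, 0.8),
    ("Knox", 7, -1.1), ("Knox", 7, -0.4), ("Knox", 7, -0.2),
]

counts = defaultdict(lambda: [0, 0])  # (district, grade) -> [positive, total]
for district, grade, score in records:
    key = (district, grade)
    counts[key][1] += 1
    if score > 0:
        counts[key][0] += 1

for (district, grade), (pos, total) in sorted(counts.items()):
    pct = 100 * pos / total
    flag = "green" if pct >= 75 else "red" if pct <= 25 else ""
    print(f"{district} grade {grade}: {pct:.0f}% positive {flag}")
```

With the toy records above, Knox’s 4th grade comes out 100% positive (green) and its 7th grade 0% positive (red), mirroring the kind of contrast visible in the tables below.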

Table 1: Percent of Schools that had Positive Value-Added Scores by Grade and District (2013)

| District | 3rd Grade | 4th Grade | 5th Grade | 6th Grade | 7th Grade | 8th Grade |
|---|---|---|---|---|---|---|
| Memphis | 41% | 43% | 45% | 19% | 14% | 76% |
| Nashville-Davidson | NA | 43% | 28% | 16% | 15% | 74% |
| Knox | 72% | 79% | 47% | 14% | 7% | 73% |
| Hamilton | 38% | 64% | 48% | 33% | 29% | 81% |
| Shelby | 97% | 76% | 61% | 6% | 50% | 69% |
| Sumner | 77% | 85% | 42% | 17% | 33% | 83% |
| Montgomery | NA | 71% | 62% | 0% | 0% | 71% |
| Rutherford | 83% | 92% | 63% | 15% | 23% | 85% |
| Williamson | NA | 88% | 58% | 11% | 33% | 100% |
| Murfreesboro | NA | 90% | 50% | 30% | NA | NA |

Table 2: Percent of Schools that had Positive Value-Added Scores by Grade and District (Three-Year Composite)

| District | 3rd Grade | 4th Grade | 5th Grade | 6th Grade | 7th Grade | 8th Grade |
|---|---|---|---|---|---|---|
| Memphis | NA | 54% | 54% | 46% | 17% | 98% |
| Nashville-Davidson | NA | 70% | 48% | 50% | 36% | 100% |
| Knox | NA | 77% | 77% | 43% | 14% | 93% |
| Hamilton | NA | 75% | 55% | 29% | 33% | 95% |
| Shelby | NA | 72% | 82% | 25% | 38% | 88% |
| Sumner | NA | 69% | 58% | 25% | 25% | 92% |
| Montgomery | NA | 85% | 95% | 14% | 0% | 86% |
| Rutherford | NA | 96% | 29% | 54% | 31% | 92% |
| Williamson | NA | 96% | 91% | 33% | 78% | 100% |
| Murfreesboro | NA | 90% | 90% | 90% | NA | NA |

As you can see, in 2013 schools were, across the board, much more likely to receive positive value-added scores in 4th and 8th grade ELA than in other grades. On the other hand, districts struggled to earn positive value-added scores in 6th and 7th grade ELA. Fifth grade ELA scores fell consistently in the middle range, while 3rd grade scores varied across districts.

Similar results were found for the three-year composite scores, except that even more schools received positive value-added scores in the 5th and 8th grades. In fact, in all nine districts that had a composite score for 8th grade, at least 86% of schools received positive value-added scores for that grade level, and two districts had 100% of their eighth grades receiving positive scores. This stands in stark contrast to the 6th and 7th grade composite scores, for which only one or two districts had a majority of their schools receive positive composite scores.

The most important question is: What does all of this mean, particularly for the validity of the inferences that are based on these (skewed) data?

The unlikely answer is that, on average, 4th and 8th grade ELA teachers across the state are more effective than the 6th and 7th grade ELA teachers and thus have earned higher value-added scores and the glory that comes with that.

Or, the more likely answer is that there is some level of bias in the TVAAS estimates, whether due to the tests, the curriculum, the scale scores (e.g., their not being vertically equated), or some other culprit that remains indeterminate (but is also adding to this mess) at this time.

Regardless, it is quite clear that 4th and 8th grade students are more likely than the 6th and 7th grade students to yield higher growth scores on the TCAP (i.e., the state standardized test upon which TVAAS scores are determined). And in the state of Tennessee, this growth is being grossly attributed to the teachers of students in these grades. Once again, everyone should take great caution before considering these scores as true measures of teacher effectiveness.


The views expressed herein and throughout all pages associated with vamboozled.com are solely those of the authors and may not reflect those of Arizona State University (ASU) or Mary Lou Fulton Teachers College (MLFTC). While the authors and others associated with vamboozled.com are affiliated with ASU and MLFTC, all opinions, views, original entries, errors, and the like should be attributable to the authors and content developers of this blog, not whatsoever to ASU or MLFTC.