Scores on exit exam hold steady

September 19, 2014

The latest results of the California High School Exit Exam echoed past years, with the share of seniors passing the test holding steady year-over-year and gains in the share of students who pass in their sophomore year.

About 95.5 percent of the class of 2014 passed the exit exam, matching the record-high passage rate set by the class of 2013, according to preliminary results released Friday by the California Department of Education.

Seventy-five percent of those students passed on their first try in their sophomore year – the first year students are eligible to take the exit exam, the numbers show. That’s a 1 percentage point increase over the share of students from the class of 2013 who passed the test in 10th grade.

The share of students passing in their sophomore year has climbed since the test was first required in 2006, when only 64 percent passed in 10th grade and 90 percent had passed by their senior year.

All California students are required to pass the exit exam in order to receive diplomas. The test is broken into two sections, a math portion that tests sixth- and seventh-grade material and some Algebra I, and an English language arts section that tests up to 10th grade material. Students must pass both sections in order to qualify for a diploma.

Passage rates for African-American, Latino and low-income students continue to lag behind those of white and Asian students, the numbers show, although the gap has narrowed since 2006. About 92 percent of African-American students in the class of 2014 passed the exit exam, up only 0.4 percentage points from the class of 2013. The passage rate among low-income students and Latinos was about 94 percent. In 2006, 84 percent of African-American students and 86 percent of low-income and Latino students passed the exit exam.

Carrie Hahnel, director of research and policy analysis at The Education Trust West, a nonprofit educational equity advocacy group, said the gains among black, Latino and low-income students are “reassuring,” but not surprising, given those groups “had more room to grow” from when the test was first administered.

Of more concern, Hahnel said, are the consistently low results posted by English learner students.

About 81 percent of English learners in the class of 2014 passed the test, a slight decrease from the previous year’s class and only about 5 percentage points higher than in 2006.

“Twenty percent of English learners are not, in theory, able to receive a diploma,” Hahnel said. “There are issues here. The test is not offered in their primary language, we’re not looking at how long students have been in the U.S. It raises questions about whether we’re offering every opportunity for students to demonstrate proficiency in basic skills and offering every opportunity to graduate from high school.”

Going Deeper

The total number of students in the class of 2014 who passed the exit exam was 417,960, according to the department of education, while 19,679 students did not pass. Past research by the department has shown that the majority of students who do not pass the exam, and therefore don’t qualify for diplomas, are also missing credits required for graduation, a department spokeswoman said.
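As a quick consistency check, the counts reported by the department reproduce the 95.5 percent passage rate cited earlier (a sketch using only the figures given above):

```python
# Consistency check: CAHSEE passage rate for the class of 2014,
# using the California Department of Education counts cited above.
passed = 417_960
not_passed = 19_679

total = passed + not_passed
rate = passed / total
print(f"{rate:.1%}")  # → 95.5%
```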

While the exit exam remains in place as a requirement for graduating seniors, its fate is unclear as the state transitions to the new Smarter Balanced assessments, which are aligned to the Common Core State Standards now being taught in California schools. Most other standardized testing has been placed on hold during that transition.

The high-stakes exit exam, however, remains pegged to the previous state standards, and it’s unclear how the mismatch is affecting student performance.

“We’ve reached this place on the (exit exam) where scores are quite high,” Hahnel said, noting a “ceiling effect” on student exam scores over time.

“Until we shift what we’re measuring to align with what we’re teaching,” she added, “I don’t know that we can expect to see much more progress.”

Comments

Tan Towers, 4 years ago

What about adults who want to earn a high school diploma?

Gary Ravani, 5 years ago

There would seem to be considerable evidence available that students’ grades are the best indicator of post secondary success as well, presumably, as their readiness to graduate from high school. This has been the case for as long as testing has been done. It is certainly true for the ACT and the SAT. As for the CAHSEE…?

Since teachers routinely assign grades to students they do not entail any extra expense. Some may see that as a disadvantage.

See here for one quote:

“InsideHigherEd.com quoted Joseph Soares, professor of sociology at Wake Forest University who has written about standardized admissions tests, as saying in an e-mail:

‘This is important because it is our first national assessment of how well test optional is doing, and the results are solid. As we have experienced here at Wake Forest, which was one of the participants in the study, being test-optional expands opportunity for low [socioeconomic status] youths and minorities of color. The study confirms that high school grades remain the best predictor of college grades; and suggests that anyone relying on test scores reduces the breadth of their applicant pool for no good reason. Test scores transmit social disparities without improving our ability to select youths who will succeed in college.'”

Excerpted from the Washington Post.

As for readiness for “21st Century Skills” and readiness for the “workforce,” I haven’t seen much yet that seems to go much beyond speculation. Is “test optional” in all of its ramifications the way to go?

FloydThursby1941, 5 years ago

Gary, grades are very unfair because they don’t take into account the school and the competition. Lowell High School in San Francisco has the best of the best, the top 15% of public and many private school kids. A few who could go choose not to because they want high school to be easier or to manipulate their GPA, but most kids who can go there do, as it’s the oldest high school west of the Mississippi, has a Supreme Court justice among its alumni, and really challenges kids to work hard. The average child has 25 hours or more of studying and homework, and kids learn the work ethic which enables college success. And I know some including Caroline like to claim that Lowell leads to kids being unbalanced, but the truth is it just limits TV; kids at Lowell date and spend time with friends as much as elsewhere and, contrary to the beliefs of Caroline, Lowell students actually participate in sports, extracurriculars, arts and volunteer/internship activities more than at any school in the state. Its detractors love to say the kids are unbalanced, but this disproves that.

A 2.8 at Lowell is probably a 3.8 anywhere else. Likewise, a 2.8 at Palo Alto or Fremont Union or Gunn High is probably equal to a 3.8 at Hayward High or even an average high school. Across the state, grades are arbitrary.

Tests such as the SAT are morally neutral measures of human goodness. They transcend the luck of which high school your parents send you to. They simply ask: what are your skills, what do you know?

The behaviors we want in our children lead to good scores consistently. If you study more, read more, watch TV less, take SAT prep classes (stimulating the economy), read SAT prep books which are available at the library, and practice math, you get a better score. All desirable behaviors by teenagers lead to an improved score.

Much of this info is suppressed to make us feel better, but test scores are far more accurate judges of a person’s academic ability, habits, moral values (do they study or goof off before a test) and intelligence than grades. A kid at a bad high school can look great in grades, such as a valedictorian from DC who flunked out of Harvard.

el, 5 years ago

As a high scorer on the SAT, I certainly recognize and have benefitted from its ability to transcend grades and to provide a different look at a student. (IME, a high SAT score tells you a lot about a student but a low SAT score tells you much less.)

But, just because it might be “level” and something you can study for, does not make it neutral. It values specific qualities – like the ability to make decisions quickly, a deep and arcane vocabulary, performance under pressure in a strange environment – while completely ignoring other skills, like the ability to quickly and accurately identify plants, solve differential equations, or write structured and logically complete computer code snippets.

Manuel, 5 years ago

el, please don’t give the College Board any ideas.

Can you imagine if they try to design a test to score essay writing? Oh, wait, I hear that they are already trying to do that with computer scoring!

And let’s not even get into writing clear and logical computer code. Who is going to decide how much code documentation is good enough? And what language should be given primacy? The old ones or the new ones? No, let’s not give them any ideas. They might take them and screw things up even more. 😉

FloydThursby1941, 5 years ago

The groups that study the most, read the most, and focus most in class do best. Yes, there is some class bias, but on average the better off have better habits; I’m not saying inheritance and nepotism don’t play some role in success, and the rich are as excessive in drug use as the poor, but poverty is largely a result of people putting themselves first and divorcing. The SAT tests habits. It tests character. Some test poorly, but they can study and practice and overcome that, and it’s really a small number of people used to discredit the SAT. A few people have a learning disability, but the SAT predicts success in life well. If you spend the summer reading, not watching shows, if you pay attention in class, if you focus on turning your weaknesses into strengths, if you study 20-25 hours instead of the California average of 5.6, it shows, which is why Asians and other ethnic groups which put more sacrifice and effort into academics reap the rewards on the SAT. It’s like Santa Claus: it knows every choice you make, it can’t be escaped. The kid who shows up and says they’re doing pretty well and are fine, and slacks off, the SAT score will prove that.

el, 5 years ago

But it is a value judgement to say kids should spend their high school years reading to develop an arcane vocabulary versus building robots or volunteering to work with disabled children or helping to build trails in their community. You used the phrase, “morally neutral measures of human goodness.” It is not. I am not saying it is wrong to use the SAT, or that doing well on it should not be a goal, only that we should recognize that there are particular values embedded in the SAT, and that it doesn’t come close to capturing a full measure of “human goodness” or talent, only a tiny corner of it.

Jeff Camp, 5 years ago

It costs real money to develop and administer this 172-item test. The test eats up gobs of student time, which should not be regarded as free. Here’s a way to think about the time cost of the exam in rough numbers. Administering the test effectively burns about one full school day (about 6 hours) for each student. The hourly cost of education per student is on the order of $8/hour, so the time cost averages to about $48 per student. There are about half a million students in each grade. It pencils out to $24 million worth of student time. http://www.cde.ca.gov/fg/fo/profile.asp?id=3557

For most students, this test is laughably easy — a pure waste of a day. For these students, the test cheapens the meaning of a high school diploma. Education spending and school hours are zero-sum. If allowed, would you pay to administer this test to all the students in your district, or would you do something different?
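The back-of-the-envelope arithmetic above can be sketched as follows (all figures are the commenter’s rough assumptions, not official cost data):

```python
# Rough time-cost model for administering the exit exam statewide,
# using the back-of-the-envelope figures above (all assumptions).
hours_per_student = 6         # roughly one full school day of testing
cost_per_hour = 8             # dollars of education time per student-hour
students_per_grade = 500_000  # approximate California grade cohort

cost_per_student = hours_per_student * cost_per_hour    # → $48
statewide_cost = cost_per_student * students_per_grade  # → $24,000,000
print(f"${statewide_cost:,} of student time")
```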

Doug McRae, 5 years ago

Jeff — As a test developer guy, I absolutely agree with you. The current CAHSEE design to test 100 percent of the kids in grade 10 is grossly inefficient, both from a student/staff time and a $$$ perspective. But, there is value in having a statewide minimum achievement standard for a HS diploma in California, and for students who do not have the achievement to pass CAHSEE easily the first time it is necessary to have some sort of a CAHSEE test. To do away with CAHSEE entirely is throwing the baby out with the bath water. Fortunately, there are ways to use other statewide comparable achievement data to “pre-qualify” kids for a CAHSEE high school graduation requirement, and then not have to administer CAHSEE to all 10th graders, thus saving a bunch of time and money. We should have gone down this path maybe half dozen years ago, but the folks driving the bus haven’t been awake at the wheel on this one.

Don, 5 years ago

Doug, as the resident testing expert, could you clarify something for me as a lay person? Why did California state government decide to postpone SB (though not long enough) on the basis that instruction and materials were not yet aligned with the content standards of CC, but did not postpone for an updated CC-aligned CAHSEE based upon the same reasoning? Don’t the same concerns of validity apply in both instances? Are these the vagaries of the wisdom of government at work, or is there some specific rationale being employed whereby postponement makes sense in the one, but not the other?

I realize that a pass/fail test of basic proficiency is crude relative to SB, but how is the cut determined and don’t the instructional changes taking place to varying degrees in the classroom affect how the cut applies when there is so much variation in CCSS implementation? And because the CAHSEE is not CCSS-aligned is it possible that the schools/districts with the slowest implementation might be at an advantage?

Doug McRae, 5 years ago

Don — Actually, CA has not decided to postpone SB tests . . . . the anticipated date for SB testing has been spring 2015 all along (which was the target date for both consortium test development efforts funded by the feds back in 2010 when the fed funds were allocated). The spring 2015 date was included in the MOU with SB signed by the Gov, SB Pres, and SSPI June 2011. One of the fundamental premises for statewide tests as measures of the results of instruction is that instruction has been implemented, and frankly we don’t have hard evidence that common core has indeed been implemented relatively thoroughly throughout CA for the 2014-15 school year. In fact, while the SBE approved Math textbooks aligned to the CC last January, ELA/ELD textbooks aligned to the CC are not scheduled for SBE approval until Nov 2015 . . . . that suggests implementing CC tests spring 2015 is quite premature. I attribute the failure to take validity of test results into account to be simply politics trumping good statewide testing practices . . . . once a decision was made to join SB, none of the folks who made that decision are willing to re-consider it despite the reality of the instructional situation now in place. I might note the decision to join SB was made by the three officials signing the MOU, no votes were taken at the SBE nor any other level of government.

In terms of CAHSEE, it is a minimal skills test affecting primarily the bottom portion of the achievement distribution in high school. An argument can be made it is not as dependent on common core standards as the full range college and career ready tests being developed by SB. Also, when CAHSEE is converted over to a new test, one of the primary considerations will be continuity of scores from the old CAHSEE to a new CAHSEE. At least that seems logical to me, that a state would want to have some continuity of a high school graduation standard from year to year over time. So, from this perspective I’m not sure there is a need to coordinate implementation of a new CAHSEE based on the common core with implementation of a common core test with wider ranging purposes. Personally, I’d rather take the time to allow the SB tests to mature for a few years, then look at ways to make the entire program more efficient (there is no need for two separate tests for 100 percent of each graduating class, even if there is a need for a separate CAHSEE test for the bottom portion of the achievement distribution), and then develop a common core CAHSEE replacement test that can operate as a supplement to the SB tests for only those students in need of a separate minimum skills high school exit exam.

I hope these perspectives address the questions you asked.

Manuel, 5 years ago

Doug, do you still feel the same as when you made your comments here?

Indeed, Brown, Torlakson, and Kirst signed up California for the SBs without much input from the public, but isn’t that how most decisions on testing have been made in the last, oh, 15 to 30 years?

And, to date, nobody has answered el’s comments there about technology needs which the LAUSD iPad mess has highlighted for everybody.

Lastly, why is there “a need for a separate CAHSEE test for the bottom portion of the achievement distribution?” I am unsure of what this need is, although I have some paranoid ideas. Can you put that in more obvious terms? Thanks!

Doug McRae, 5 years ago

Manuel —

The link in your first line does not work, so cannot respond . . . .

Re previous decisions on statewide tests, for STAR there were open written submissions from multiple vendors vetted and discussed by the SBE with opportunity for public comments in 1997, 2002, and 2005. Since 2005, the vendor contract has been amended multiple times but without opportunity for competitive alternatives. For Smarter Balanced, there was a 2 hour hearing featuring the two federally funded consortia at the March 2011 SBE meeting, just 6 months into their 4 year test development projects so the presentations were focused on promises rather than actuals, with initial public comments on the promises, before the MOU for SB tests was signed in June 2011 without a formal vote by the SBE. The going-forward MOU with UCLA-SBAC approved by the SBE this last July did have a formal SBE vote, but no competitive opportunities were entertained or discussed by the board. My recollection for prior to 1997 was that all statewide test decisions were done via formal competitive RFPs with SBE votes, following state rules for competitive contracts.

Not sure about answering EL’s comments . . . . no link to those comments.

Finally, re the need for a separate CAHSEE focused on the bottom portion of the achievement distribution, all I was trying to say there is that a test like SBAC focused on college/career readiness, even a computer-adaptive test, is not likely to be useful for direct measurement of the minimum E/LA and Math achievement needed for a CA high school diploma. Nothing more sinister than that . . . . a comment on the capability of different types of tests.

Doug

Manuel, 5 years ago

Doug, sorry about that. It appears that the software in this site chose to eat that part. But here is the full URL to your very informative comment of three years ago:

http://toped.svefoundation.org/2011/06/10/california-jumps-to-other-test-consortium/comment-page-1/#comment-31984

Hopefully, it will show now. (El’s comment is just below yours.)

Anyway, thank you for informing us that previous adoptions of testing followed a competitive RFP process but have now been turned into a sole-source process with no public discussion or vetting.

That’s a shame although it better defines the responsibility when the problems will eventually surface. The children, of course, will have to bear the consequences.

Incidentally, I did not mean to imply that there was something sinister in characterizing the CAHSEE as the test for the bottom half of the distribution. But having such a test speaks volumes about the impossibility of having all students, not just some, meet proficiency on any test, be that the CST or the SB. Recognizing the need both for a test of the minimum required to get a high school diploma and for one that is supposed to measure college/career readiness is not sinister; it merely reflects a reality that many of our education politicians and education “advocates” refuse to acknowledge.

(FWIW, I share your position on this issue: adoption of the SB should have taken place only after a vigorous public discussion. I expect that this will have repercussions as this will reflect on the upcoming battles over the de facto imposition of curricula for high schools across the nation by the College Board through its revision of the AP tests. This is already playing interestingly in Jefferson County, Colorado, where the argument has been framed in tea-party-like sound bites. May we avoid such fate.)

Doug McRae, 5 years ago

Manuel — Thank you for your links to my comment when CA joined SBAC three years ago and to EL’s following comment. You have a better memory than I do . . . . .

After re-reading my comment, I haven’t changed my tune at all . . . . didn’t see anything in that extensive comment that I would modify. And I’d still agree with EL’s comment also. The technology piece for computerized statewide assessments has actually progressed faster than I anticipated, at least apparently so based on the technology successes this last spring, but substantial equity questions still remain to be addressed. And non-technology issues focusing on having valid, reliable, fair scores from new computerized statewide tests have hardly been addressed at all over the last three years. I’ve expressed the view many times in public forums over the past several years that spring 2018 would be the earliest time CA should expect to have valid, reliable, fair computer administered statewide assessment results . . . . I still feel that way, tho the recent technology successes do have me thinking that spring 2017 is a decent possibility. Stubbornly staying with the spring 2015 commitment from 3 years ago is head-in-the-sand strategic planning . . . .

CarolineSF, 5 years ago

The headline is inaccurate — it’s not the scores (which aren’t discussed here and as I understand it aren’t revealed) but the passage rate that’s holding steady.

Doug McRae, 5 years ago

Caroline — The primary “scores” for CAHSEE are pass/fail, so the headline is accurate. There are other scores for the CAHSEE that are used for other purposes (such as federal accountability), but the post deals with the primary purpose for the CAHSEE which is whether students have the minimum achievement needed to qualify for a high school diploma in California and for that purpose the scores referenced are pass/fail.

CarolineSF, 5 years ago

Fair enough. The implication is that we (the public) actually know what the score levels were, which isn’t the case.

FloydThursby1941, 5 years ago

Everyone’s score should be made public. This will make kids work harder in school. Anything which makes kids study more hours and goof off fewer will pay huge dividends for America, smarter working, smarter voting, etc.