The state does collect some objective, meaningful data to measure the performance of its alternate route programs. The state requires Maryland Approved Alternative Preparation Programs to submit an annual data report that includes principal satisfaction ratings (a rating of 90 percent or higher indicates that graduates are deemed as good as or better than other first-year teachers); participants' satisfaction with the training and support received in the program, including their preparedness to teach upon completion; and data from intern supervisors and residency mentors. Maryland also requires that programs advance through the levels of program development set out in the MAAP Guidelines, though the state specifies no consequences for programs that fail to progress.

However, the state does not collect these data for its traditional teacher preparation programs; for those programs it collects only annual summary licensure test pass rates (80 percent of program completers must pass their licensure exams). The 80 percent pass-rate standard, though common among states, sets the bar quite low and is not a meaningful measure of program performance.

Further, in the past three years, no programs in the state have been identified as low performing—an additional indicator that programs lack accountability.

Maryland's website does not include a report card that allows the public to review and compare program performance.

According to the state's winning Race to the Top application, Maryland has made objective outcomes a central component of its teacher preparation program approval process. The state plans to link its Longitudinal Data System with the Educator Information System to identify where teachers received preparation and whether they have been rated "effective" or "highly effective" as measured by student growth. Maryland has articulated that it will publish these data by fall 2013, and by fall 2014, it will use them to improve programs, and close and/or deny approval to those with poor track records. However, there is no evidence to date of specific policy to support these plans.

Performance Criteria
http://www.marylandpublicschools.org/NR/rdonlyres/2C7FFCC4-3F21-4B62-9406-311B06CDF2DB/19746/InstitutionalPerformanceCriteria31109.pdf
Title II State Reports
https://title2.ed.gov
Race to the Top Application
http://www2.ed.gov/programs/racetothetop/phase2-applications/maryland.pdf

Recommendations for Maryland

Collect data that connect student achievement gains to teacher preparation programs. To ensure that programs are producing effective classroom teachers, Maryland should consider the academic achievement gains of students taught by programs' graduates, averaged over the graduates' first three years of teaching. Although Maryland has commendably outlined its intentions in its RttT application, the state should codify these requirements in policy to ensure that preparation programs are held accountable.

Gather other meaningful data that reflect program performance. In addition to knowing whether programs are producing effective teachers, other objective, meaningful data can also indicate whether programs are appropriately screening applicants and whether they are delivering essential academic and professional knowledge. Building on the data the state currently collects for its alternate route programs, Maryland should gather data for all teacher preparation programs, such as the following: average raw scores of graduates on licensing tests, including basic skills, subject matter and professional knowledge tests; satisfaction ratings by school principals and teacher supervisors of programs' student teachers, using a standardized form to permit program comparison; evaluation results from the first and/or second year of teaching; and five-year retention rates of graduates in the teaching profession.

Establish the minimum standard of performance for each category of data. Programs should be held accountable for meeting these standards, with articulated consequences for failing to do so, including loss of program approval after appropriate due process.

Publish an annual report card on the state's website. To inform the public with meaningful, readily understandable indicators of how well programs are doing, Maryland should present all the data it collects on individual teacher preparation programs. NCTQ acknowledges that Maryland has articulated a plan to post an annual report card for the public as part of its RttT application. However, to date this plan has not been enacted or codified in state policy.

State response to our analysis

Maryland asserted that it uses the state's teacher standards, the "Essential Dimensions of Teaching" (EDOT), and the national InTASC standards (newly revised and adopted in spring 2011) as alignment tools for program approval. Further, all preparation programs are guided by the "Redesign of Teacher Education," which includes four components: strong academic background, extensive internship, performance assessment and linkage with PK-12 priorities.

Maryland added that candidate performance data are included in all IHE assessment systems. For combined state program approval and national accreditation visits, NCATE standards must also be met. The standards for all of the measures are performance- and outcomes-based, focusing on the quality of teachers.

The state noted that the Title II federal report is an annual accountability report that all programs must complete each year. Providers that fall below the 80 percent pass rate have the information published in a public report on the Title II website. If a program is designated as low performing, a plan is put into place with the assistance of the state to address weaknesses. Maryland added that all institutions placed on this list improved, and that this year, no programs in the state are on the low-performing list. In April 2011, states reported program completers for the 2009-10 cohort in a report to the federal government. The final version of that report was submitted in October 2011.

Maryland pointed out that MAAPPs (Maryland Approved Alternative Preparation Programs) can determine the satisfaction of principals and mentors with teachers employed in their schools on the Resident Teacher Certificate during the first year of their careers, and thus can provide valuable performance information; a comparable survey for traditional programs would be effectively impossible without the ability to track the hiring and placement of their graduates. The state added that one of the hallmarks of its Race to the Top work is the development of a system that will allow such tracking. The Teacher Preparation Improvement Plan provides for annual collection of data, including PK-12 student performance results from required portfolio projects or action research projects in the Professional Development Schools (PDSs). In addition, Maryland tracks the AYP performance of PDSs and is currently upgrading efforts to pair lower-performing schools with higher-performing ones.

Maryland noted that it provides extraordinary technical assistance to its teacher preparation programs, using all relevant standards, along with the Institutional Performance Criteria of the Redesign of Teacher Education in Maryland, as guideposts in an effort to assure strong performance. The state's first priority is to elicit strong performance through data collection and program improvement interventions rather than through penalties. When those interventions fail to produce the intended outcomes in a timely fashion, penalties are imposed.

Exiting Ineffective Teachers

How we graded

States need to hold programs accountable for the quality of their graduates.

The state should examine a number of factors when measuring the performance of and approving teacher preparation programs. Although the quality of both the subject-matter preparation and professional sequence is crucial, there are also additional measures that can provide the state and the public with meaningful, readily understandable indicators of how well programs are doing when it comes to preparing teachers to be successful in the classroom.

States have made great strides in building data systems with the capacity to provide evidence of teacher performance. These same data can be used to provide objective evidence of the performance of teacher preparation programs. States should make such data, as well as other objective measures that go beyond licensure pass rates, a central component of their teacher preparation program approval processes, and they should establish precise standards for performance that are more useful for accountability purposes.

Research rationale

For discussion of teacher preparation program approval, see Andrew Rotherham's chapter "Back to the Future: The History and Politics of State Teacher Licensure and Certification," in A Qualified Teacher in Every Classroom (Harvard Education Press, 2004).

For evidence of how weak state efforts to hold teacher preparation programs accountable are, see the data on programs identified as low performing in the U.S. Department of Education's Secretary's Seventh Annual Report on Teacher Quality (2010).

For additional discussion of and research on how teacher education programs can add value to their teachers, see NCTQ, Tomorrow's Teachers: Evaluating Education Schools, available at http://www.nctq.org/p/edschools.

For a discussion of the lack of evidence that national accreditation status enhances teacher preparation programs' effectiveness, see D. Ballou and M. Podgursky, "Teacher Training and Licensure: A Layman's Guide," in Better Teachers, Better Schools, ed. Marci Kanstoroom and Chester E. Finn, Jr. (Washington, D.C.: Thomas B. Fordham Foundation, 1999), 45-47. See also No Common Denominator: The Preparation of Elementary Teachers in Mathematics by America's Education Schools (NCTQ, 2008) and What Education Schools Aren't Teaching About Reading and What Elementary Teachers Aren't Learning (NCTQ, 2006).