Transcription

Power Calculation Using the Online Variance Almanac (Web VA): A User's Guide

Larry V. Hedges & E.C. Hedberg

This research was supported by the National Science Foundation under Award Nos. and . Additional support was provided to Hedges by the Institute of Education Sciences under Award No. R305D. Any opinions, findings, and conclusions or recommendations expressed here are those of the author(s) and do not necessarily reflect the views of the National Science Foundation or the Institute of Education Sciences.

Center for Advancing Research and Communication in Science, Technology, Engineering, and Mathematics (ARC)
NORC at the University of Chicago
1155 East 60th Street
Chicago, IL 60637

Table of Contents

Introduction .... 1
The Web VA .... 3
Using the Web VA .... 4
Stata program for computing power using Web VA design parameters .... 8
Example power analyses .... 8
  A design with no covariates (unconditional model) .... 9
  A design using pretest as a covariate .... 10
  A design using demographic variables as covariates .... 11
  A design using both pretest and demographic covariates .... 12

Exhibits
Exhibit 1: Surveys providing values for the Web VA .... 3
Exhibit 2: Opening page of the Web VA resource .... 4
Exhibit 3: Filter selection page of the Web VA resource .... 6
Exhibit 4: Sample output from the Web VA .... 7
Exhibit 5: Design parameters based on a national probability sample of Grade 12 Mathematics Achievement from All Schools in The National Education Longitudinal Study (NELS). (All Regions, All Urbanicities) .... 8
Exhibit 6: Stata output from the RDPOWER program based on parameters for a design with no covariates .... 9
Exhibit 7: Stata output from the RDPOWER program based on parameters for a design using pretest as a covariate .... 11
Exhibit 8: Stata output from the RDPOWER program based on parameters for a design using demographic variables as covariates .... 12
Exhibit 9: Stata output from the RDPOWER program based on parameters for a design using both pretest and demographic covariates .... 13

Introduction

In planning any research study, it is crucial that the design be sufficiently sensitive to detect effects of the size that the treatment is expected to produce. Wise design therefore requires statistical power and precision computations, to ensure that the effects that are expected have a high probability of being detected by the design.

In designs that involve simple random sampling and assign individuals to treatments, power and precision calculations are straightforward. Statistical power at a given statistical significance level is determined entirely by the expected effect size, the sample size, and (if covariates are used) the number of covariates and their effectiveness in explaining variance in the outcome (measured by the R² value). Choosing a design with a given statistical power thus simply involves finding the sample size that yields that power given the effect size and, if covariates are used, the R² value.

However, not all research designs involve simple random sampling and assignment of individuals to treatments. Education studies frequently sample students within schools, which is not simple random sampling but a form of two-stage cluster sampling. Moreover, experiments and quasi-experiments in education and the social sciences often assign intact groups of individuals (such as schools or classrooms) to treatments. Such studies are called cluster randomized or group randomized studies. In the experimental design literature, they are also called hierarchical experiments, or experiments with groups as nested factors within treatments. In other contexts they are called multilevel experiments, to emphasize the multiple levels of sampling (first groups, then individuals within groups).

Studies with group assignment arise for many reasons. It may be impractical to assign individuals within the same group to different treatments (e.g., to assign different students in the same classroom to different curricula or teaching methods).
It may be politically infeasible to assign individuals within the same group to different treatments (e.g., to assign different students in the same school to different class sizes or resource regimes). Finally, it may be theoretically impossible to assign different treatments to individuals within the same group (e.g., when the treatment is a whole-school reform such as a management reform).

The design of such studies poses challenges more complicated than those of studies with simple random sampling and assignment of individuals to treatments. For example, cluster sampling reduces efficiency, so that a cluster sample contains less information than a simple random sample of the same size. (The reduction in information, reflected in increased estimation error variance, is sometimes called the design effect in survey statistics.) The reduced efficiency of cluster sampling, combined with the assignment of intact groups, means that studies with group assignment are less sensitive than studies with the same total sample size that use simple random sampling and individual assignment. Moreover, because studies with group assignment involve multilevel sampling, statistical power computations are more complex, and conventional software for

computing power is not intended for these designs (and will not provide the appropriate calculations). For example, in multilevel studies with group assignment, statistical power depends not only on total sample size, but also on how that sample size is allocated across levels of the design. A design that assigns 100 groups of 10 students each to treatments will have very different statistical power than a study that assigns 10 groups of 100 students each to treatments, even though they have the same total sample size of 1,000. The structure of the population variance decomposition (e.g., the proportion of total variation that arises between groups) also affects statistical power. This decomposition is often summarized by an intraclass correlation (ICC), which is literally the ratio of between-group variance to total variance in the population.

As with individual assignment, the use of covariates also affects statistical power in designs with group assignment. However, because group assignment with cluster sampling involves two levels of variation (group level and individual within-group level), covariates can have effects on variation at either or both levels. (This is explicit when multilevel statistical analyses are used.) The effect of covariates depends on how much variation is explained at each level of the design and is usually summarized by the R² value at each level (i.e., a group-level R² and an individual-level R²).

Thus statistical power calculations in designs with cluster sampling and group assignment to treatments require knowledge not only of the effect size and the sample sizes at each level of the design, but also information about the population variance structure (summarized by the intraclass correlation) and, if covariates are used, information about the effectiveness of the covariates in explaining variation (summarized by the R² values at each level of the design).
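The effect of allocation can be made concrete with the design effect of two-stage cluster sampling, 1 + (n − 1)ρ, where n is the number of students per group and ρ is the intraclass correlation. The short sketch below is an illustration (not part of the Web VA or RDPOWER), using an assumed ICC of 0.20 for the two allocations just described:

```python
# Design effect for two-stage cluster sampling: DEFF = 1 + (n - 1) * rho,
# where n is the number of students per group and rho is the intraclass
# correlation. The effective sample size is the total N divided by DEFF.

def design_effect(n_per_group: int, rho: float) -> float:
    return 1 + (n_per_group - 1) * rho

rho = 0.20      # assumed ICC, for illustration only
total_n = 1000  # same total sample size in both allocations

for groups, n in [(100, 10), (10, 100)]:
    deff = design_effect(n, rho)
    print(f"{groups} groups of {n}: design effect = {deff:.1f}, "
          f"effective sample size = {total_n / deff:.0f}")
```

With the same 1,000 students and an ICC of 0.20, 100 groups of 10 behave like roughly 357 independent observations, while 10 groups of 100 behave like only about 48, which is why the first allocation yields far more statistical power.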
It is difficult to know, in the abstract and before a study has been carried out, what values of the design parameters (intraclass correlations and R² values) should be used to plan a study. The purpose of the Online Variance Almanac (Web VA) is to provide users with a resource for principled planning of research designs: empirical evidence on design parameters, drawn from surveys of academic achievement.

The Web VA

The Web VA allows users to access thousands of intraclass correlations and R² values from different datasets, grades, subjects, and subsamples. These values were obtained from analyses of four national surveys that used probability samples of students in kindergarten through 12th grade in U.S. schools. The surveys included are listed in Exhibit 1.

Exhibit 1: Surveys providing values for the Web VA

o The Early Childhood Longitudinal Program (ECLS)
o The Longitudinal Study of American Youth (LSAY)
o The National Education Longitudinal Study (NELS)
o Prospects: The Congressionally Mandated Study of Educational Growth and Opportunity

In analyzing the surveys to calculate the values available via the Web VA, we considered the following:

Achievement Domain. For each of these surveys we computed reference values for reading and mathematics achievement.

Geographic Subgroups. For each of these surveys, we computed reference values for the entire nation and for geographic subgroups defined by region and by degree of urbanicity of the school location, as defined by those surveys.

School Subsample. For each achievement domain and geographic group, we also considered subsamples of schools defined by the level of average achievement in the school and the level of average SES in the school.

Research Design. We computed design parameters for four different research designs. One design used no covariates. A second design used only a pretest given to the same students one year (or, in the NELS survey, two years) earlier. A third design used only demographic variables as covariates: a composite measure of socioeconomic status, gender, and indicators of Black and Hispanic race/ethnicity. A fourth design included both the pretest and the demographic covariates.

A full discussion of the methodology employed to calculate the ICCs and R² values reported here is contained in Hedges and Hedberg's (2007) Intraclass Correlation Values for Planning Group-Randomized Trials in Education.
Also reported in that 2007 article are values for the nation as a whole (all regions, all urbanicities). The Web VA provides access to both the national values and values for specific subgroups (e.g., regions and urbanicities).

Using the Web VA

The Center for Advancing Research and Communication in Science, Technology, Engineering, and Mathematics (ARC) provides access to the Web VA from its web page. Please follow these steps to use the Web VA resource to obtain reference values for computing statistical power in group randomized experiments.

1) Click on the Go to VA arrow on the ARC web page, or go directly to the Web VA URL. You will be taken to a web page like that shown in Exhibit 2.

Exhibit 2: Opening page of the Web VA resource

2) If you would like to join our mailing list, please enter your email address. Providing this information is optional; if you prefer not to provide it, click on Skip to Search.

3) You will be asked to select filters for your query, as shown in Exhibit 3. After each selection, you will need to click on Submit. These filters are:

o Subject
  - Mathematics Achievement
  - Reading Achievement
o Region [of the U.S.]
  - All Regions
  - Midwest
  - Northeast
  - South
  - West

Exhibit 3: Filter selection page of the Web VA resource

4) Once all the filters have been selected, click on Submit. This will lead you to a new web page that displays a formatted table with the design parameters of interest and a separate table of sample information, as shown in Exhibit 4. This page is formatted for easy printing. The parameter selections entered to generate these tables are included in the titles and footnotes.

Exhibit 4: Sample output from the Web VA

CAUTION: Even though the total sample sizes of these surveys are large, the sample sizes for some of the subgroups are rather small. It is important to pay attention to the sample sizes and particularly to the standard errors of the design parameters, which reflect the uncertainty of the estimates.

Stata program for computing power using Web VA design parameters

Hedges and Hedberg developed the program RDPOWER, available for use with Stata 11, to calculate statistical power using output from the Web VA. To install this program in Stata, follow these four steps:

a) Launch Stata 11.
b) Make sure you are connected to the internet and have administrative privileges.
c) Type ssc install rdpower.
d) Stata will report when the program is installed.

Following are examples of using RDPOWER to calculate statistical power from Web VA results. With Stata open, you can also type help rdpower to learn more about the program's features and to see additional examples.

Example Power Analyses

The Web VA gives the parameters necessary to calculate statistical power for four designs: (1) a design with no covariates (unconditional model); (2) a design using pretest as a covariate; (3) a design using demographic variables as covariates; and (4) a design using both pretest and demographic covariates. Looking again at the sample Web VA results presented in Exhibit 4 (the design parameters of interest are reproduced in Exhibit 5), the sections below show how to calculate power using RDPOWER for each design in turn. These results are based on 12th grade math scores from NELS, using all schools from all regions and all urbanicities. For the following examples, we will focus only on the intraclass correlation and R² values reproduced below.

Exhibit 5: Design parameters based on a national probability sample of Grade 12 Mathematics Achievement from All Schools in The National Education Longitudinal Study (NELS). (All Regions, All Urbanicities)

Covariates Used                                                          | Intraclass Correlation | Standard Error of the ICC | R² (School Level) | R² (Student Level)
None                                                                     | 0.2386                 |                           |                   |
Pretest (one school and one student level variable)                      |                        |                           | 0.9746            | 0.7985
SES, race, gender (four school and four student level variables)         |                        |                           | 0.7817            | 0.1018
Pretest, SES, race, gender (five school and five student level variables)|                        |                           | 0.9764            | 0.8010
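For readers who want to sanity-check RDPOWER's output outside Stata, the following Python sketch approximates the power of a two-level cluster randomized design from the same inputs (es, n, m, icc2, pre2, pre1). It is an assumption-laden illustration, not RDPOWER itself: it uses a normal approximation rather than the noncentral t distribution, so with few clusters it runs slightly optimistic, and the function name and interface are our own.

```python
from statistics import NormalDist

def crt_power(es, n, m, icc, pre2=0.0, pre1=0.0, alpha=0.05):
    """Approximate power for a two-level cluster randomized design with
    m clusters per treatment arm, n students per cluster, and treatment
    assigned at level 2. pre2 and pre1 are the proportions of variance
    the covariates explain at the school and student levels.

    Normal approximation: RDPOWER also adjusts degrees of freedom for
    the number of level-2 covariates (l2vars), which is ignored here."""
    # Variance of the estimated effect size: level-2 and level-1 variance
    # components, each reduced by its R-squared, divided by the sample
    # sizes at each level, times 2 for a two-arm comparison.
    var = 2.0 * (icc * (1 - pre2) + (1 - icc) * (1 - pre1) / n) / m
    lam = es / var ** 0.5                  # noncentrality parameter
    z = NormalDist()
    crit = z.inv_cdf(1 - alpha / 2)        # two-sided critical value
    return z.cdf(lam - crit) + z.cdf(-lam - crit)
```

For example, crt_power(0.25, 25, 8, 0.2386) approximates the power of the unconditional design analyzed below, and adding pre2/pre1 reproduces the covariate-adjusted designs.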

A design with no covariates (unconditional model)

Suppose that the design being considered uses no covariates. Let us calculate power for a two-level randomized design in which students are sampled within schools and treatment is assigned to schools (that is, treatment is at level 2). The intraclass correlation value for this model (icc2) is the ICC value in the first row of Exhibit 5 (0.2386). To calculate power for detecting an effect size (a standardized mean difference, or d-index) of 0.25, with 8 schools per treatment and 25 students per school (8 x 2 x 25, for a total sample size of 400 students), we type the following into Stata:

rdpower crd2, es(0.25) n(25) m(8) icc2(0.2386)

This command tells Stata to use the RDPOWER program to calculate power for a two-level cluster randomized design (crd2), with an effect size (es) of 0.25, 25 students per cluster (n=25), 8 clusters per treatment (m=8), and an intraclass correlation at level 2 of 0.2386 (icc2=0.2386). Exhibit 6 shows the output from this command, including the statistical power of this design.

Exhibit 6: Stata output from the RDPOWER program based on parameters for a design with no covariates
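As a rough cross-check on the Exhibit 6 output, the power of this unconditional design can be approximated by hand: the variance of the estimated effect size is 2(ρ + (1 − ρ)/n)/m, and power follows from a normal approximation to the test. This is a back-of-the-envelope illustration, not RDPOWER's exact noncentral-t computation:

```python
from statistics import NormalDist

es, n, m, icc = 0.25, 25, 8, 0.2386    # values from the rdpower command above

# Variance of the estimated effect size for a two-arm, two-level design
var = 2.0 * (icc + (1 - icc) / n) / m
lam = es / var ** 0.5                   # noncentrality parameter

z = NormalDist()
crit = z.inv_cdf(0.975)                 # two-sided test at alpha = .05
power = z.cdf(lam - crit) + z.cdf(-lam - crit)
print(f"approximate power: {power:.2f}")
```

The approximation comes out to only about 0.16: with this ICC, 8 schools per arm and 25 students per school give little chance of detecting an effect size of 0.25, which is exactly why the covariate-adjusted designs in the next sections matter.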

A design using pretest as a covariate

In many situations pretest data are available for each student. We have determined the R² values associated with entering a group-centered pretest score and the school average pretest score. Important: in this model we have entered 1 level-2 variable (l2vars). Still using the intraclass correlation value from the unconditional model (in this case 0.2386), we now enter the R² values from the pretest covariate model (the cells highlighted in yellow in Exhibit 5). In this case, the between-school R² value (the proportional reduction in error at level 2, or pre2) is 0.9746, and the within-school R² value (the proportional reduction in error at level 1, or pre1) is 0.7985.

Next we will calculate power for a two-level randomized design in which treatment is at level 2. To calculate power for detecting an effect size of 0.25, with 8 schools per treatment and 25 students per school (for a total sample size of 400 students), type the following into Stata:

rdpower crd2, es(0.25) n(25) m(8) icc2(0.2386) pre2(0.9746) pre1(0.7985) l2vars(1)

This tells Stata to use the RDPOWER program to calculate power for a two-level cluster randomized design (crd2) with an effect size (es) of 0.25, 25 students per cluster (n=25), 8 clusters per treatment (m=8), an intraclass correlation (icc2) of 0.2386, a pre2 value of 0.9746, a pre1 value of 0.7985, and 1 variable at level 2 (l2vars=1). Exhibit 7 shows the output from this command, including the power for this design.
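Under the same normal approximation used for the unconditional design (an illustrative cross-check, not RDPOWER's exact computation), the pretest shrinks both variance components and power rises dramatically:

```python
from statistics import NormalDist

es, n, m, icc = 0.25, 25, 8, 0.2386
pre2, pre1 = 0.9746, 0.7985             # R-squared values from the command above

# Each variance component is reduced by the proportion of variance the
# pretest explains at that level.
var = 2.0 * (icc * (1 - pre2) + (1 - icc) * (1 - pre1) / n) / m
lam = es / var ** 0.5

z = NormalDist()
crit = z.inv_cdf(0.975)
power = z.cdf(lam - crit) + z.cdf(-lam - crit)
print(f"approximate power: {power:.2f}")
```

Under this approximation the same 400 students now yield power near 0.99: a strong pretest is by far the most effective way to rescue a small cluster randomized design.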

Exhibit 7: Stata output from the RDPOWER program based on parameters for a design using pretest as a covariate

A design using demographic variables as covariates

In many situations demographic data are available for each student. In our demographic covariates model we have determined the R² values associated with entering group-centered demographic variables and their school averages. Important: in this model we have entered 4 level-2 variables (l2vars). Still using the intraclass correlation value from the unconditional model (in this case an icc2 of 0.2386), we now enter the R² values from the demographic covariates model (the cells highlighted in blue in Exhibit 5). As shown in Exhibit 5, the between-school (pre2) R² value is 0.7817 and the within-school (pre1) R² value is 0.1018.

Next we will calculate power for a two-level cluster randomized design (crd2) in which treatment is at level 2. To calculate power for detecting an effect size (es) of 0.25, with 8 schools per treatment and 25 students per school (for a total sample size of 400 students), type the following into Stata:

rdpower crd2, es(0.25) n(25) m(8) icc2(0.2386) pre2(0.7817) pre1(0.1018) l2vars(4)

This tells Stata to use the RDPOWER program to calculate power for a two-level cluster randomized design (crd2) with an effect size (es) of 0.25, 25 students per cluster (n=25), 8 clusters per treatment (m=8), a level-2 ICC (icc2) of 0.2386, a level-2 PRE value (pre2) of 0.7817, a level-1 PRE value (pre1) of 0.1018, and 4 variables at level 2 (l2vars=4). Exhibit 8 shows the output from this command, including the power for this design.

Exhibit 8: Stata output from the RDPOWER program based on parameters for a design using demographic variables as covariates

A design using both pretest and demographic covariates

In many situations both pretest and demographic data are available for each student. In our pretest and demographic covariates model, we have determined the R² values associated with entering group-centered pretest and demographic variables and their school averages. Important: in this model we have entered 5 level-2 variables. Still using the ICC value from the unconditional model (in this case 0.2386), we now enter the R² values from the pretest and demographic covariates model (the cells highlighted in orange in Exhibit 5). In this case, the between-school (pre2) R² value is 0.9764 and the within-school (pre1) R² value is 0.8010.

Next we will calculate power for a two-level cluster randomized design in which treatment is at level 2. To calculate power for detecting an effect size of 0.25, with 8 schools per treatment and 25 students per school (for a total sample size of 400 students), type the following into Stata:

rdpower crd2, es(0.25) n(25) m(8) icc2(0.2386) pre2(0.9764) pre1(0.8010) l2vars(5)

This tells Stata to use the RDPOWER program to calculate power for a two-level cluster randomized design (crd2), with an effect size (es) of 0.25, 25 students per cluster (n=25), 8 clusters per treatment (m=8), a level-2 intraclass correlation (icc2) of 0.2386, a level-2 PRE value (pre2) of 0.9764, a level-1 PRE value (pre1) of 0.8010, and 5 variables at level 2 (l2vars=5). Exhibit 9 shows the output from this command, including the power for this design.

Exhibit 9: Stata output from the RDPOWER program based on parameters for a design using both pretest and demographic covariates
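The last two designs can be cross-checked the same way. The sketch below uses the same illustrative normal approximation as the earlier checks (not RDPOWER itself, and ignoring the degrees-of-freedom adjustment for l2vars) to compute approximate power for the demographics-only design and the pretest-plus-demographics design from the values in the two commands above:

```python
from statistics import NormalDist

def approx_power(es, n, m, icc, pre2, pre1, alpha=0.05):
    """Normal-approximation power for a two-level cluster randomized
    design: m clusters per arm, n students per cluster, covariates
    explaining pre2 / pre1 of the level-2 / level-1 variance."""
    var = 2.0 * (icc * (1 - pre2) + (1 - icc) * (1 - pre1) / n) / m
    lam = es / var ** 0.5
    z = NormalDist()
    crit = z.inv_cdf(1 - alpha / 2)
    return z.cdf(lam - crit) + z.cdf(-lam - crit)

# Demographic covariates only (SES, race, gender)
p_demo = approx_power(0.25, 25, 8, 0.2386, pre2=0.7817, pre1=0.1018)

# Both pretest and demographic covariates
p_both = approx_power(0.25, 25, 8, 0.2386, pre2=0.9764, pre1=0.8010)

print(f"demographics only:      {p_demo:.2f}")
print(f"pretest + demographics: {p_both:.2f}")
```

Under this approximation, demographics alone (roughly 0.43) more than double the unconditional power but still leave the design underpowered, while adding the pretest pushes approximate power above 0.99, essentially matching the pretest-only design.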

Sources for more information

For more information on power analysis in designs with cluster sampling and group assignment, see:

Hedges, L.V., and Hedberg, E.C. (2007). Intraclass Correlation Values for Planning Group-Randomized Trials in Education. Educational Evaluation and Policy Analysis, 29.

Hedges, L.V., and Rhoads, C. (2009). Statistical Power Analysis in Education Research (NCSER). Washington, DC: National Center for Special Education Research, Institute of Education Sciences, U.S. Department of Education. This report is available on the IES website.

Raudenbush, S.W. (1997). Statistical Analysis and Optimal Design for Cluster Randomized Experiments. Psychological Methods, 2.

Raudenbush, S.W., and Liu, X. (2000). Statistical Power and Optimal Design for Multisite Randomized Trials. Psychological Methods, 5(3).

Schochet, P. (2008). Statistical Power for Random Assignment Evaluations of Education Programs. Journal of Educational and Behavioral Statistics, 33.

For more information on the data sources used to estimate these design parameters, see:

Miller, Jon D. Longitudinal Study of American Youth, and 2007 [Computer file]. ICPSR30263-v1. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor].

U.S. Department of Education. Prospects: The Congressionally Mandated Study of Educational Growth and Opportunity.

U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics. Early Childhood Longitudinal Study (United States): Kindergarten Class, Kindergarten-Eighth Grade Full Sample [Computer file]. ICPSR28023-v1. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor].

U.S. Department of Education, National Center for Education Statistics. National Education Longitudinal Study: Base Year Through Third Follow-Up [Computer file]. ICPSR version. Washington, DC: U.S. Department of Education, National Center for Education Statistics [producer]; Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor]. ICPSR06961.

For questions regarding these instructions, or comments on the Web VA, please contact ARC.
