
3 critical education topics affecting U.S. students

Brown Center on Education Policy releases annual report examining three distinct areas of education

An annual report examines the persistent gender gap in reading performance, how the Common Core is affecting reading achievement, and the key role intrinsic motivation plays in student engagement, offering analyses in all three areas.
The study is the fourteenth Brown Center Report on American Education. It is divided into three sections, each dedicated to an independent topic and each based on the best evidence available, as described within that section.
The reading gender gap
As author Tom Loveless notes, “girls outscore boys on practically every reading test given to a large population. And they have for a long time.”

The gap is not unique to the U.S. Finland’s girls, for instance, scored 62 points higher on the PISA reading test than its boys did.
The report notes that biological and developmental differences, school practices that make no particular effort to close the gap, and cultural influences are among the most widely cited explanations for the reading gender gap.
Of the eight national assessments examined in the report, the test score gaps are statistically significant on all eight. The gaps also appear narrower among younger students and wider among older ones.
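As a rough illustration of what such a significance check involves (with invented numbers, not the report’s data), a two-sample z-test compares the gap to its standard error:

```python
# A minimal sketch of how a gender gap's statistical significance might be
# checked. All figures below are invented for illustration, not from the report.
from math import sqrt

girls_mean, girls_sd, girls_n = 225.0, 36.0, 5000   # hypothetical NAEP-like data
boys_mean,  boys_sd,  boys_n  = 219.0, 37.0, 5000

gap = girls_mean - boys_mean
se = sqrt(girls_sd**2 / girls_n + boys_sd**2 / boys_n)  # standard error of the gap
z = gap / se
print(f"gap = {gap:.1f} points, z = {z:.1f}")  # |z| > 1.96 -> significant at 5%
```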
Loveless also raises the question of what makes a gap small or large and illustrates the gaps with tables found in the report.
Measuring the Common Core’s effectiveness
The second section of the report seeks to pinpoint when the Common Core started having an impact on student learning.
It also asks how educators and policymakers will recognize the standards’ influence on student achievement if that influence has not yet materialized.
One of the key goals in this section, Loveless notes, is “persuading readers that deciding when a policy begins is elemental to evaluating its effects. The question of a policy’s starting point is not always easy to answer. Yet the answer has consequences. You can’t figure out whether a policy worked or not unless you know when it began.”
The analysis uses surveys of state implementation to identify different Common Core starting points for states, and from that it compiles a second “early report card” on the standards’ impact.
Because states differed in how quickly and aggressively they implemented the Common Core, the report constructed two indexes, based on surveys of state education agencies, to model Common Core implementation.
A 2011 survey focuses on the number of programs on which states said they spent federal funds to implement the Common Core. A 2013 survey focuses on states’ responses regarding when they planned to complete full Common Core implementation in classrooms. The report uses fourth grade NAEP reading scores as the achievement measure, because “reading instruction is a key activity of elementary classrooms. …The impact of CCSS on reading instruction…will be concentrated in the activities of a single teacher in elementary schools.”
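A minimal sketch of that comparison’s shape, with invented state names, groupings, and scores rather than the report’s actual indexes and data:

```python
# Hedged sketch: group states by an implementation index and compare average
# NAEP grade 4 reading gains, 2009-2013. All numbers below are invented.
from collections import defaultdict

# (state, implementation group, 2009 score, 2013 score) -- illustrative only
states = [
    ("State1", "strong",      220.1, 221.5),
    ("State2", "strong",      218.4, 220.0),
    ("State3", "medium",      221.0, 221.8),
    ("State4", "non-adopter", 219.5, 219.6),
]

gains = defaultdict(list)
for _state, group, y2009, y2013 in states:
    gains[group].append(y2013 - y2009)

for group, gs in gains.items():
    print(f"{group:>12}: mean gain = {sum(gs) / len(gs):+.2f} scale points")
```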
Results showed small advantages for strong implementers, and the report cautions that because only four states fall into the non-adopter category (the logical, though small, control group), big changes in one or two of those states can sway the results.
“Taken together, the 2011 and 2013 indexes estimate that NAEP reading gains from 2009–2013 were one to one and one-half scale score points larger in the strong CCSS implementation states compared to the states that did not adopt CCSS,” Loveless noted in the report.
“These differences, although certainly encouraging to CCSS supporters, are quite small, amounting to (at most) 0.04 standard deviations (SD) on the NAEP scale. A threshold of 0.20 SD—five times larger—is often invoked as the minimum size for a test score change to be regarded as noticeable,” he added in the conclusion.
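The arithmetic behind that conversion can be checked directly, assuming a NAEP grade 4 reading standard deviation of roughly 37 scale points (an assumption here, implied by the report’s own 0.04 SD figure):

```python
# Hedged check of the effect-size arithmetic. The NAEP standard deviation is an
# assumption (~37 scale points), chosen to match the report's 0.04 SD figure.
gain_points = 1.5              # upper end of the reported 1 to 1.5 point gain
naep_sd = 37.5                 # assumed grade 4 reading scale-score SD
effect_size = gain_points / naep_sd
print(f"effect size ~= {effect_size:.2f} SD")                   # ~0.04 SD
print(f"0.20 SD threshold is {0.20 / effect_size:.0f}x larger")  # ~5x
```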
Student engagement
The report examines engagement as it pertains to “the intensity with which students apply themselves to learning in school.”
The report uses PISA data to measure student engagement, noting that the international test is given to 15-year-old students and that high school is widely seen as a time when students lose interest in learning.
“This analysis concludes that American students are about average in terms of engagement. Data reveal that several countries noted for their superior ranking on PISA—e.g., Korea, Japan, Finland, Poland, and the Netherlands—score below the U.S. on measures of student engagement,” Loveless wrote. “Thus, the relationship of achievement to student engagement is not clear cut, with some evidence pointing toward a weak positive relationship and other evidence indicating a modest negative relationship.”
The unit of analysis (individual data, aggregated data, etc.) is critical “when examining data on students’ characteristics and their relationship to achievement,” as noted in the report.
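A synthetic example (invented data, not PISA’s) shows why: engagement can predict achievement positively within every country even while country-level averages correlate negatively.

```python
# Hedged illustration of why the unit of analysis matters. All data are
# synthetic: higher-scoring countries report lower average engagement, yet
# within each country engagement and achievement are positively linked.
import random
random.seed(0)

def corr(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Hypothetical countries: (mean score, mean engagement)
countries = {"A": (520, 0.40), "B": (500, 0.55), "C": (480, 0.70)}
for name, (mean_score, mean_eng) in countries.items():
    eng = [mean_eng + random.gauss(0, 0.1) for _ in range(500)]
    scores = [mean_score + 50 * (e - mean_eng) + random.gauss(0, 10) for e in eng]
    print(f"{name}: within-country r = {corr(eng, scores):+.2f}")   # positive

agg_eng = [v[1] for v in countries.values()]
agg_scores = [v[0] for v in countries.values()]
print(f"country-level r = {corr(agg_eng, agg_scores):+.2f}")        # -1.00 here
```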
As for what the PISA data suggest for policymakers, the report recommends that programs designed to increase student engagement be evaluated on a small scale before wide implementation.
International evidence outlined in the report “does not justify wide-scale concern over current levels of student engagement in the U.S. or support the hypothesis that boosting student engagement would raise student performance nationally.”