
K-12


Since its inception, CRESST has conducted research, development, and evaluation aimed at improving Pre-K–12 public education across the United States. Our innovative methods and indicators for evaluating educational quality are in broad use, including comprehensive approaches for monitoring and improving schools and their programs. CRESST directors and researchers are making important contributions to the development of innovative K-12 assessments that measure the Common Core State Standards. CRESST also has long-standing expertise in the application of emerging technologies, exploring technology's potential in the design, administration, and interpretation of assessments; in the dissemination of the center's products; and in stimulating communication with Pre-K–12 education policymakers, practitioners, and the public.

Funded by the U.S. Department of Education's Institute of Education Sciences, CRESST and the UCLA Department of Psychology created the national Center for Advanced Technology in Schools (CATS). The center combines research in cognitive psychology, instruction, assessment, and new technologies, including gaming, to improve underperforming middle school students' understanding of fundamental math concepts.

The CATS study includes partners from the University of Southern California, UC Santa Barbara, Arizona State University, and ETS, along with a national advisory group that includes several dozen technology and game experts, cognitive psychologists, and noted researchers in mathematics and science teaching and learning.

Sample Game - Save Patch
Save Patch is an educational video game designed to promote student understanding of two foundational ideas about fractions and rational number addition: (a) the size of a rational number is relative to how one whole unit is defined; and (b) only identical units can be added to create a single numerical sum. The game was developed as a research testbed for studying design variables.
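Both ideas reduce to short worked arithmetic (our illustration; the game itself presents them visually). For idea (a), the same point on a number line names 1/2 when the whole unit is one block, but 2/4 when each block is cut into fourths. For idea (b), thirds and sixths are different units, so 1/3 and 1/6 cannot be added directly; renaming 1/3 in sixths produces identical units first:

    1/3 + 1/6 = 2/6 + 1/6 = 3/6 = 1/2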



For more information, please visit the CATS website or contact:

Noelle Griffin, Ph.D., Assistant Director for Research and Evaluation
Phone: 310-825-8605 | Email: griffin@cse.ucla.edu



A recently completed project, the POWERSOURCE© study, developed a new approach to the design of formative assessment tools in middle school mathematics and created professional development and instructional resources to support teachers' use of those tools.

Background
Although research clearly highlights the promise of formative assessment, this promise may be hard to fulfill. Evidence suggests that many teachers are unable to use information from benchmark tests or their own assessments because they lack the knowledge, materials, or curricular time to do so. To address these complex issues—that is, to enable teachers to use formative assessments more effectively and to ensure that the assessments used are of sufficiently high technical quality—we developed a strategy grounded in research on learning and targeted at fundamental principles of middle-school mathematics.

The specific purpose of this strategy, called POWERSOURCE©, is to provide assessment information and resources to middle-school teachers, with the aim of improving both teachers' and students' understanding of the key ideas that are the prerequisites to mastering algebra. The emphasis on algebra and pre-algebra is strategic, since failure to master Algebra I keeps many students from advancing in mathematics and graduating from high school.

The POWERSOURCE© intervention targets big ideas and related skills in four domains underlying success in Algebra I (worked examples for two of these domains follow the list):

1) Rational number equivalence (RNE)

2) Properties of arithmetic (PA; the distributive property)

3) Principles for solving linear equations (SE)

4) Application of core principles in these domains to other critical areas of mathematics, such as geometry and probability (RA)
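For readers unfamiliar with the middle two domains, two brief worked examples (ours, not drawn from the POWERSOURCE© materials):

    PA: the distributive property rewrites a product over a sum, e.g.
        7 × 23 = 7 × (20 + 3) = 7 × 20 + 7 × 3 = 140 + 21 = 161

    SE: linear equations are solved by performing the same operation on both sides, e.g.
        3x + 5 = 20 becomes 3x = 15 (subtract 5 from both sides), then x = 5 (divide both sides by 3)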

Within each of the selected content areas, we designed a series of short POWERSOURCE© assessments (Checks for Understanding) to help teachers assess their students' understanding of basic mathematical principles, connect that understanding to their instruction, and provide feedback that supports deeper understanding. For example, the following figure shows some items from one of the Checks for Understanding of Rational Number Equivalence (RNE):

These items were intended to elicit students' understanding of what rational number equivalence is, how to find equivalent rational numbers, and how the multiplicative identity property and related procedures can be used to find them.
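The multiplicative identity reasoning runs as follows (a worked example of ours, not an assessment item): multiplying a number by 1 leaves it unchanged, and any fraction of the form n/n equals 1, so

    2/3 = 2/3 × 1 = 2/3 × 4/4 = 8/12

which is why 2/3 and 8/12 are equivalent rational numbers.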

POWERSOURCE© teachers participated in initial summer professional development and follow-up sessions during the school year, and used teacher handbooks that included the formative assessments, guidance on understanding student responses, and instructional support materials. In contrast, the comparison group received either district-designed professional development or an alternative (non-POWERSOURCE©) professional development, both without access to POWERSOURCE© instructional supports. Here we present data obtained in the 2007–2008 school year from a randomized study comparing teachers using POWERSOURCE© with teachers using a traditional, textbook-based approach to the same content.

Research Questions
We hypothesize that POWERSOURCE© students will:

1) possess a better understanding of the basic mathematical principles contained within each domain;

2) be able to apply concepts they have learned, solve complex problems, and transfer the principles covered by the POWERSOURCE© domains.

Study Design and Sample
Seven districts, including 28 schools, more than 90 teachers, and over 4,000 students, participated in the study. Teachers and/or schools were randomly assigned to POWERSOURCE© or control conditions. POWERSOURCE© teachers were given three Checks for Understanding for each domain—one prior to the first day's set of instructional materials, one between the first and second days of instruction, and one after the second day of instruction. Students in the control group did not complete any of the Checks for Understanding; thus, control students and teachers had no exposure to any of the POWERSOURCE© materials or concepts during the school year. All students (POWERSOURCE© and control) completed a test of prerequisite knowledge (pretest) at the beginning of the school year and a transfer measure of math knowledge at the end of the school year. The figure below shows an overview of the POWERSOURCE© sequence:

Major Results
POWERSOURCE© students outperformed comparison students on the transfer measure for Properties of Arithmetic:

Overall, POWERSOURCE© students who were initially relatively higher performing outperformed similar control students:

Conclusion

• POWERSOURCE©'s short, targeted intervention focused on key mathematical principles and required only limited professional development, yet it produced statistically significant effects on student learning.

• POWERSOURCE© had more impact on relatively higher performing students than relatively lower performing ones.

• Results suggest that effects were strongest for the most difficult content, Properties of Arithmetic.

For more information, please contact:
Julia Phelan, Ph.D., Senior Researcher
Phone: 310-206-4998 | Email: jcsv@ucla.edu



Background

The Bill and Melinda Gates Foundation has engaged the National Center for Research on Evaluation, Standards, and Student Testing (CRESST) to develop literacy assessments that can help teachers, schools, and districts assess their students' progress toward college readiness in English language arts (ELA), history, and science. The CRESST assessments are aligned with the Common Core State Standards for English/Language Arts and designed to measure students' subject matter understanding and literacy development. In the two-day (90-minute) assessment, each student reads a variety of topic-related texts, responds to multiple-choice reading comprehension and subject matter questions, and composes an essay to explain or argue a central issue addressed by both the readings and coursework.

Task Components

Topics of Exemplar Assessments

For more information, please contact:

Mark Valderrama, Research Analyst
Phone: 310-206-4139 | Email: valderrama@cse.ucla.edu

Julia Phelan, Ph.D., Senior Researcher
Phone: 310-206-4998 | Email: jcsv@ucla.edu


In collaboration with Research for Action (RFA), CRESST is conducting quasi-experimental studies of the implementation and impact of two interventions funded by the Bill and Melinda Gates Foundation. The studies support the transition from state assessments to the Common Core State Standards (CCSS). The Literacy Design Collaborative involves school districts, teachers, and other partners in a coordinated set of instructional modules and assessments that embody the CCSS in English language arts. The centerpiece of the approach is a set of "template" tasks that teachers of subjects such as literature, social studies, and science use to engage sixth- through twelfth-grade students in reading, analyzing, and writing about a variety of text types. The Mathematics Design Collaborative (MDC) focuses on "formative assessment lessons" that can be embedded in classroom curriculum. Instruments developed by CRESST for both parts of the study include specially developed measures of contextual variables, implementation fidelity, and student outcomes. Researchers are also collecting existing data on student performance and demographics.

With funding from the Bill and Melinda Gates Foundation, CRESST is continuing its evaluation of Green Dot Public Schools' reforms at Alain Leroy Locke High School in Los Angeles. In the Longitudinal Effects of the Locke Transformation Project, CRESST is incorporating a third year of student performance data into earlier studies. Using propensity-matched treatment and comparison samples, the study examines how well students are performing in terms of school persistence, attendance, course-taking and completion, A-G completion rate, and graduation rate, as well as achievement on standardized tests in English language arts and mathematics. The study also examines the intersection of teacher quality and school reform effects.
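Propensity matching of this kind pairs each treatment student with a comparison student who looks similar on observed covariates. Below is a minimal sketch in Python with hypothetical column names; it illustrates the general technique only and makes no claim to reproduce CRESST's actual analysis.

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    def propensity_match(df, covariates, treat_col="treated"):
        """1:1 nearest-neighbor matching on estimated propensity scores.

        df has one row per student; treat_col is 1 for treatment students
        and 0 for potential comparison students (hypothetical schema)."""
        # Step 1: estimate each student's probability of treatment.
        model = LogisticRegression(max_iter=1000).fit(df[covariates], df[treat_col])
        df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

        # Step 2: pair each treated student with the closest-scoring comparison student.
        treated = df[df[treat_col] == 1]
        control = df[df[treat_col] == 0]
        nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
        _, idx = nn.kneighbors(treated[["pscore"]])
        return treated, control.iloc[idx.ravel()]

    # Example (hypothetical columns): match on prior achievement and attendance,
    # then compare outcomes between the two matched samples.
    # treated, matched = propensity_match(students, ["prior_ela", "prior_math", "attendance"])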

Overview

The Dynamic Language Learning Progressions (DLLP) Project, in partnership with the WIDA ASSETS Consortium, plans to:

1) develop research-conjectured and empirically validated dynamic language learning progressions (DLLPs) encompassing the language development of students, pre-K through grade 12, both English proficient and English learners, for a range of academic language functions (for example, explanation, description, definition) needed for success in school;

2) inform, in conjunction with the WIDA standards, the WIDA consortium's development of summative and interim assessments;

3) develop materials to support the use of the DLLPs by teachers for instruction and formative assessment.


Schedule of Work

1) Developing DLLPs. Currently, we are creating a methodology for modeling a dynamic language learning progression. As a proof of concept, we are focusing on the language function "explanation", initially for the K-5 grade range. The progression will not be described by grade-level expectations, but rather as key, potentially overlapping phases of language development. To populate the progression, we are generating data from a collection of oral and written explanations used in the service of justification and persuasion. The data sources are i) verbatim explanations contained in the existing literature on language development; ii) newly generated oral and written explanations from prompts presented to 90-120 K-5 children, strategically selected by grade, native/non-native English speaker status, English Language Development level, time in the U.S., and reading level. Students will be presented with identical stimuli to prompt their oral and written explanations, and student responses will be analyzed using protocols developed for the project; iii) classroom observations of the use of explanations by teachers and students; and iv) teacher feedback on the defensibility of the progression and its potential uses.

2) Validation Studies. We will conduct iterative validation research on the DLLPs focused on two key objectives: i) to accurately reflect progressions of children's language development; and ii) to optimize the utility of the progressions for teachers. At this juncture we will expand the project to include progressions for additional language functions, additional grade levels, or both.

3) Professional Development Materials. Using the products from Parts 1 and 2 of the project, we will create professional development materials to support teachers' use of the DLLPs for the purposes of instruction and formative assessment.


For More Information:

Please contact: Alison Bailey, Ph.D.
Email: abailey@gseis.ucla.edu
Please visit the project website for more details.



Funded by the Defense Advanced Research Projects Agency (DARPA)/ENGAGE program, CRESST is participating in a collaboration to develop and evaluate educational games for the purpose of increasing young learners' understanding of, and interest in, science, technology, engineering, and mathematics (STEM).

The collaborative CRESST project, GAMECHANGER: Using Technology to Improve Young Children's STEM Learning, will ultimately produce innovative computer games with practical applications across early science curricula. Of particular interest is students' ability to transfer learning to performance outside the game's context and content. CRESST will also lead the development and refinement of student outcome measures for the games, resulting in a fully vetted set of measures of science content, metacognitive skills, and core socio-emotional outcomes.

For more information, please contact:
Noelle Griffin, Ph.D., Assistant Director
Phone: 310-825-8605 | Email: griffin@cse.ucla.edu

On behalf of WestEd, CRESST is conducting an evaluation of CALIPERS II: Using Simulations to Assess Complex Science Learning. Funded by the National Science Foundation, WestEd researchers are studying the potential of technology- and simulation-based assessments to provide high-quality evidence of complex performances in science, for both accountability and formative purposes. CRESST is developing program evaluation instruments; conferring on design and sampling; conducting observations and interviews of CALIPERS implementation; and providing data analysis. More information is available on the project website.

For more information, please contact:
Joan Herman, Ph.D., Director
Phone: 310-206-3701 | Email: herman@cse.ucla.edu

CRESST is conducting a formative and summative evaluation of MOBILIZE: Mobilizing for Innovative Computer Science Teaching and Learning. MOBILIZE is a mathematics and science partnership that includes Center X in UCLA's Graduate School of Education & Information Studies; the Center for Embedded Networked Sensing (CENS) at UCLA's Henry Samueli School of Engineering and Applied Science; and the Los Angeles Unified School District. The MOBILIZE project incorporates participatory sensing and hand-held technology into an innovative intervention designed both to improve students' core computer science and computational skills and to build teacher and school instructional capacity. CRESST's evaluation work includes a range of quantitative and qualitative measures, as well as the development of new approaches to assessing computer science skills.

For more information, please contact:
Noelle Griffin, Ph.D., Assistant Director
Phone: 310-825-8605 | Email: griffin@cse.ucla.edu

CRESST is collaborating with the Public Broadcasting Service (PBS) to develop an indicator and reporting system that measures game-based outcomes of the Ready to Learn PBS KIDS program. Featuring favorite PBS characters such as Curious George and Sid the Science Kid, the Ready to Learn (RTL) program combines content across different media—including video, online games, mobile applications, and off-line activities—to improve math and literacy learning for young children ages 2-8. CRESST researchers are developing a reporting system that includes as many as 14 separate report components, ranging from the amount of time a child plays a particular game to the skills he or she has mastered. Reports will provide information to the child, teacher, and parent, with additional features such as performance averages and trends across different groups.
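One plausible shape for a single report component, sketched as a small record type (field names are hypothetical; this is not the actual CRESST reporting schema):

    from dataclasses import dataclass
    from datetime import timedelta

    @dataclass
    class ReportComponent:
        """One component of a child's game report (hypothetical schema)."""
        child_id: str
        game_id: str
        play_time: timedelta    # time the child spent in this game
        skills_mastered: list   # skill codes the child has demonstrated
        group_average: float    # comparison average across the child's group

    # Example: the kind of record a teacher- or parent-facing report might aggregate.
    component = ReportComponent(
        child_id="C123",
        game_id="curious-george-counting",
        play_time=timedelta(minutes=25),
        skills_mastered=["counting-to-10", "number-recognition"],
        group_average=0.72,
    )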

For more information, please contact:
Gregory Chung, Ph.D., Senior Researcher
Phone: 310-794-4392 | Email: greg@ucla.edu

In the Evaluation of the Enhanced Assessment Grants project, CRESST researchers are conducting a formative evaluation of a series of embedded assessment modules developed by WestEd. CRESST is also providing guidance on reliability studies of simulation-based assessments, as well as assisting with validity studies. CRESST responsibilities include instrument development, data collection, classroom observations, data analysis, and reporting.

For more information, please contact:
Joan Herman, Ph.D., Director
Phone: 310-206-3701 | Email: herman@cse.ucla.edu

CRESST researchers are applying new extensions of hierarchical models to important questions of educational quality. Funded by the Institute of Education Sciences, researchers are estimating teacher effects over time, analyzing trajectories of teacher effect profiles, and examining the influence of school characteristics on teacher effect estimates. The study will also provide direct insight into the equitable distribution of student achievement within schools.
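In its simplest form, a hierarchical (multilevel) model of this kind treats each teacher as a group and estimates a teacher effect as a random intercept. Below is a minimal sketch using statsmodels, with hypothetical column names and none of the extensions the study develops.

    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per student: an achievement score, a pretest covariate,
    # and the student's teacher (hypothetical columns).
    scores = pd.read_csv("scores.csv")

    # Random intercept per teacher; after adjusting for pretest, the fitted
    # intercepts serve as basic teacher-effect estimates.
    model = smf.mixedlm("achievement ~ pretest", data=scores,
                        groups=scores["teacher_id"])
    result = model.fit()

    teacher_effects = result.random_effects  # dict: teacher_id -> estimated effect
    print(result.summary())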

For more information, please contact:
Joan Herman, Ph.D., Director
Phone: 310-206-3701 | Email: herman@cse.ucla.edu

Using both quantitative and qualitative measures, CRESST is conducting an evaluation of UCLA IMPACT: Inspiring Minds through a Professional Alliance of Community Teachers. IMPACT is an innovative 18-month teacher residency program. The evaluation will answer specific questions about the quality of IMPACT program implementation, its processes, and its effectiveness in terms of both teacher education and student outcomes. Although the major focus is on quantitative indicators, such as student test data, CRESST is also including qualitative data analysis to provide deeper information about student teachers' educational practice. The evaluation serves both summative and formative purposes; that is, it will provide results pertinent to overall program effectiveness as well as information the program can use on an ongoing basis for improvement and refinement.

For more information, please contact:
Joan Herman, Ph.D., Director
Phone: 310-206-3701 | Email: herman@cse.ucla.edu

Using methodology and expertise from previous after-school evaluations, CRESST is conducting the CDE After School Program Evaluation, supported by the California Department of Education. This 4-year statewide evaluation includes approximately 4,000 elementary and middle school sites plus 190 high school sites. As part of the evaluation, the CRESST research team collected data from California's STAR program, the California English Language Development Test, and the California High School Exit Exam, while analyzing high school graduation rates across the state. Evaluators are measuring whether after-school programs contribute to improved attendance, homework completion, graduation rates, and student achievement.

For more information, please contact:
Denise Huang, Ph.D., Senior Researcher
Phone: 310-206-9642 | Email: dhuang@cse.ucla.edu

CRESST is working in collaboration with American Education Solutions and participating districts to evaluate the Magnet Schools Assistance Program's (MSAP) impact on student achievement. CRESST researchers are using a rigorous quasi-experimental design with statewide, student-level achievement test data and standardized, validated survey measures to conduct in-depth analyses of academic outcomes for magnet school students compared to students at traditional schools. For each participating district, the evaluation is providing detailed insight into the magnet school goals that are consistent across all cooperating districts and the MSAP. The project will culminate in a final-year meta-analysis, drawing on data from all districts to test the features and conditions under which magnet school programs are most effective.

For more information, please contact:
Jia Wang, Ph.D., Senior Researcher
Phone: 310-267-4476 | Email: jwang@gseis.ucla.edu

For more than 18 years, CRESST has studied the effects of the LA's BEST After School Enrichment Program on outcomes including student achievement, long-term academic attainment, social development, health habits, dropout rates, citizenship, delinquency, and crime. The studies span multiple sites and use advanced statistical models and new indicators of impact. Two selected studies follow.

In Preparation of the 21st Century Skills in LA's BEST, CRESST explored the relationships between afterschool participation and three key 21st century skills: self-efficacy, collaboration, and oral communication. Findings supported existing literature showing that self-efficacy is significantly related to both collaboration and oral communication skills. Additionally, researchers found that the self-evaluations of students with higher program participation aligned more closely with their academic performance and with teacher ratings of student self-efficacy, collaboration, and oral communication.

In another study, CRESST examined whether participation in the LA's BEST after-school program during students' elementary years affected their Course-Taking Patterns in Middle School. Researchers found that greater LA's BEST participation helped to improve students' math grade point averages and math performance on the California state test. The study also found that higher intensity of participation in LA's BEST led to higher eighth-grade GPAs in science and history. As for middle school course-taking patterns, LA's BEST students with a minimum of 140 days of participation were more likely than non-participants to take algebra in eighth grade.

For more information, please contact:
Joan Herman, Ph.D., Director
Phone: 310-206-3701 | Email: herman@cse.ucla.edu

In the Efficacy Study of a Diagnostic Formative Assessment for Middle School Science, CRESST is collaborating with WestEd on a 4-year study to measure the effectiveness of the Assessing Science Knowledge (ASK) diagnostic formative assessment system. ASK is an integral part of the Full Option Science System (FOSS) elementary science modules published by Delta Education, Inc. The WestEd/CRESST study is providing data on the nature of quality assessment practices that influence learning and the factors that affect the quality of such practice.

For more information, please contact:
Denise Huang, Ph.D., Senior Researcher
Phone: 310-206-9642 | Email: dhuang@cse.ucla.edu

In the When to Exit ELL Students project, CRESST researchers are examining the effect of English language learner (ELL) reclassification on subsequent student academic success, as measured by growth on annual state assessments, success on high school exit exams, and persistence in school. Funded by the Institute of Education Sciences, the CRESST team is identifying key student characteristics that predict variability in ELL performance and academic success following reclassification. In particular, the team is investigating which school and district English language development strategies are most successful in promoting academic success after reclassification.

For more information, please contact:
Joan Herman, Ph.D., Director
Phone: 310-206-3701 | Email: herman@cse.ucla.edu

In the Providing Support to States to Improve the Assessment of ELLs project, CRESST is working closely with a small number of states on research to improve the assessments used with English language learners. The research examines the linguistic complexity and language demands of states' English language proficiency assessments and of academic achievement assessments in mathematics and science. Researchers are also investigating the effects of specific accommodations, growth patterns in academic proficiency for subgroups of ELLs across academic content areas, and ELLs' opportunity to learn mathematics and science. Findings from these research areas will provide guidance for states in the design and use of ELL assessments.

For more information, please contact:
Joan Herman, Ph.D., Director
Phone: 310-206-3701 | Email: herman@cse.ucla.edu

The Partnership for Accessible Reading Assessment is a collaboration between CRESST, the National Center on Educational Outcomes at the University of Minnesota, and Westat, a research corporation in Rockville, Maryland. The research teams are developing research-based principles to make large-scale reading assessments more accessible to students with disabilities. The researchers are also developing new reading assessments suitable for large-scale evaluation of students with disabilities, which will provide valid results for students and schools. The research is funded by the U.S. Department of Education, Office of Special Education Programs.

For more information, please contact:
Jamal Abedi, Ph.D., Professor of Education
Phone: 530-754-9150 | Email: jabedi@ucdavis.edu