Tuesday, December 5, 2017

Using a State-Developed Tool to Measure Students’ Knowledge and Skills at Kindergarten Entry



A new study from Regional Educational Laboratory Southwest examined the construct validity of New Mexico's Kindergarten Observation Tool and provides evidence for the tool's validity and reliability.

The Kindergarten Observation Tool's primary purpose is to inform instruction: kindergarten teachers can use the information it provides about their students' knowledge and skills to guide their curricular and pedagogical decisions. Many states have developed (or are using existing) kindergarten entry assessments to measure and document children's knowledge and skills systematically when they enter school.

The study found support for using the Kindergarten Observation Tool to measure two distinct domains of students' knowledge and skills: cognitive school readiness and noncognitive school readiness. The study also found support for generating an overall score of students' general school readiness. Although there was no support for generating scores based on the developer's six intended domains (physical development, health, and well-being; literacy; numeracy; scientific conceptual understanding; self, family, and community; and approaches to learning), kindergarten teachers can use scores from the two validated domains, as well as ratings from individual items within those domains, to better understand and plan for children's knowledge and skills at the beginning of kindergarten.

Details:
 
The purpose of this study was to determine whether there was scientific support for using the New Mexico Kindergarten Observation Tool (KOT) to measure distinct domains of children's knowledge and skills at kindergarten entry. The research team conducted exploratory and confirmatory factor analyses to identify the latent constructs (or domains) measured in the 2015 KOT field test. 
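The study's data and model specifications are not reproduced here, but the exploratory step can be illustrated with a minimal sketch in Python. It assumes a hypothetical item-response matrix, kot_ratings (one row per child, one column per KOT item), and uses scikit-learn's FactorAnalysis; the confirmatory analyses would typically be fit separately in dedicated structural equation modeling software.

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    # Hypothetical data: ratings for 500 children on 26 KOT items (values are illustrative only).
    rng = np.random.default_rng(0)
    kot_ratings = rng.integers(1, 5, size=(500, 26)).astype(float)

    # Exploratory factor analysis with two latent factors and a varimax rotation,
    # mirroring the two-domain structure described in the study.
    efa = FactorAnalysis(n_components=2, rotation="varimax")
    efa.fit(kot_ratings)

    # Loadings: how strongly each of the 26 items relates to each extracted factor.
    loadings = efa.components_.T  # shape (26 items, 2 factors)
    print(np.round(loadings, 2))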
 
In addition, internal consistency analyses were conducted and Rasch modeling was applied to examine item functioning and differential item functioning among student subgroups. Correlational analyses were conducted to examine patterns of associations between validated KOT domains and an independent kindergarten assessment—the Dynamic Indicators of Basic Early Literacy Skills (DIBELS). 
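As one illustration of the internal consistency step, Cronbach's alpha can be computed directly from the item-level variances and the variance of the total score. The sketch below is a generic NumPy implementation using a hypothetical rating matrix; it is not drawn from the study's code.

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents x n_items) rating matrix."""
        k = items.shape[1]                         # number of items
        item_vars = items.var(axis=0, ddof=1)      # variance of each item
        total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Example with hypothetical ratings for 500 children on 26 items.
    rng = np.random.default_rng(1)
    ratings = rng.integers(1, 5, size=(500, 26)).astype(float)
    print(round(cronbach_alpha(ratings), 3))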
 
Finally, the research team examined the proportion of classroom-level variance in children's KOT scores by calculating the variance partition coefficient after fitting four-level unconditional models. 
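The variance partition coefficient itself is a simple ratio: the share of total variance in KOT scores attributable to the classroom level, computed from the variance components of the unconditional multilevel model. The sketch below uses hypothetical variance components for a four-level nesting (children within classrooms within schools within districts); the actual estimates would come from fitting the model to the field-test data.

    # Hypothetical variance components from a four-level unconditional model
    # (children nested in classrooms, classrooms in schools, schools in districts).
    var_child     = 0.60   # residual (child-level) variance
    var_classroom = 0.25
    var_school    = 0.10
    var_district  = 0.05

    total_var = var_child + var_classroom + var_school + var_district

    # Variance partition coefficient for the classroom level:
    # the proportion of total score variance attributable to classrooms.
    vpc_classroom = var_classroom / total_var
    print(f"Classroom-level VPC: {vpc_classroom:.2f}")  # 0.25 with these illustrative numbers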
 
Factor analyses provided support for a two-domain structure measuring children's knowledge and skills in two distinct areas: (1) cognitive school readiness (or academic knowledge and skills) and (2) noncognitive school readiness (or learning and social skills), as well as support for a one-domain structure measuring children's general school readiness. 
 
In addition, these KOT domains were moderately correlated with the DIBELS; the KOT cognitive domain was more strongly correlated with DIBELS than the KOT noncognitive domain. For each of the 26 KOT items, rating scale categories functioned appropriately. Three KOT items demonstrated differential item functioning for student subgroups, which signals potential bias for these items. 
 
Additional work is required to determine whether those items are truly unfair to certain student subgroups. 
 
Finally, classroom-level variation in children's KOT ratings was found. Although there was no scientific support for generating KOT scores based on the state's six intended domains (Physical Development, Health, and Well-Being; Literacy; Numeracy; Scientific Conceptual Understanding; Self, Family, and Community; and Approaches to Learning), the 2015 KOT field test produced valid and reliable measures of children's knowledge and skills across two distinct domains and for one overall score. Kindergarten teachers can use these scores to better understand and plan for individual children's knowledge and skills at the beginning of kindergarten. 
 
Recommended next steps for New Mexico include replication of construct validity analyses with the most recent version of the KOT, consultation with a content expert review panel to investigate further the three items flagged for potential item bias, and further investigation of the sources of classroom-level variance.
