Can we know the ingredients that make programs work?
by Michael Keany
Sep 27, 2023
By Marta Pelligrini, University of Cagliari, Italy
Educational programs comprise several components that together affect student achievement, such as the use of small-group activities or a focus on vocabulary enhancement. What is difficult to know is which components make a program work and how large their effects are. This information would be highly relevant to researchers and program developers seeking to design effective interventions.
The What Works Clearinghouse (WWC) has applied a meta-analytic method new to its reviews, Bayesian meta-analysis, to explore the extent to which program components explain the impact of interventions. To investigate the potential of this method, the WWC used a total of 29 studies on the effects of 25 early literacy interventions on alphabetics in grades K–3. A panel of experts in early literacy developed a taxonomy describing the program components and coded the 29 studies accordingly. The WWC focused its analysis on 15 component domains (e.g., instructional practices to build comprehension skills).
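The WWC report does not publish its model code, and its full Bayesian meta-regression is more elaborate than can be shown here. As a rough illustration of the underlying idea, though, the sketch below pools study effect sizes under a prior and compares studies whose interventions do and do not include a coded component. All data are made up, and the conjugate normal-normal model with known per-study variances is a deliberate simplification of the WWC's approach.

```python
import numpy as np

# Hypothetical data: observed effect sizes and standard errors for six
# studies, plus a 0/1 code for whether each study's intervention includes
# a given component (all values illustrative, not from the WWC report).
y = np.array([0.30, 0.10, 0.45, 0.05, 0.25, 0.15])   # effect sizes
se = np.array([0.10, 0.12, 0.15, 0.08, 0.11, 0.09])  # standard errors
x = np.array([1, 0, 1, 0, 1, 0])                     # component present?

def posterior_mean_sd(y, se, prior_mean=0.0, prior_sd=1.0):
    """Conjugate normal-normal update for a common mean effect with
    known per-study variances (fixed-effect Bayesian pooling)."""
    prec = 1.0 / prior_sd**2 + np.sum(1.0 / se**2)
    mean = (prior_mean / prior_sd**2 + np.sum(y / se**2)) / prec
    return mean, np.sqrt(1.0 / prec)

# Pool each group separately, then compare the posterior means.
m1, s1 = posterior_mean_sd(y[x == 1], se[x == 1])  # with component
m0, s0 = posterior_mean_sd(y[x == 0], se[x == 0])  # without component
diff_mean = m1 - m0
diff_sd = np.sqrt(s1**2 + s0**2)
print(f"Posterior mean difference: {diff_mean:.3f} (sd {diff_sd:.3f})")
```

In this toy setup, a positive posterior mean difference would suggest the component is associated with larger effects; the posterior standard deviation conveys how uncertain that association is, which is the kind of hedged statement the WWC's exploratory analysis aims to produce.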
Overall, results showed that most of the multifaceted early literacy interventions (72%) had positive impacts on alphabetics outcomes. Regarding the association between specific component domains and intervention effects, results suggest that ‘testing and screening’ (e.g., using formative assessment) and ‘student placement in groups based on assessments’ are the components with the largest impacts, followed by ‘non-academic student supports’. Other domains, such as ‘educator support’ and ‘organizational structures’, were found to have a negative association. However, the component domains explained only 9% of the variation in the interventions’ effects; the remaining 91% is attributable to factors that were not considered, such as contextual factors, characteristics of intervention implementation (e.g., duration), and study quality. The authors cautioned that the statistical approach is exploratory and that practitioners should interpret the results with care.
From a methodological perspective, this first trial of the WWC’s proposed methods calls for further research on the association between program components and effects. The exploratory results offer insights into the use of Bayesian meta-analysis and of taxonomies for coding intervention components. Understanding why programs work is essential to designing and implementing effective interventions, but evaluating which components are associated with large impacts is challenging. Future research must account for other factors (e.g., unmeasured components, duration, and who implements the program). Additionally, coding program components is not always straightforward, because primary studies do not report all relevant data; unreported components may reduce the accuracy of the results.