Modeling multiple dependent variables in meta-analysis of single-case experimental design using multilevel modeling.

Eunkyeng Baek, Wen Luo
Author Information
  1. Eunkyeng Baek: Educational Psychology, Texas A&M University, 718E Harrington Tower, 4225 TAMU, College Station, TX, 77843-4225, USA. baek@tamu.edu.
  2. Wen Luo: Educational Psychology, Texas A&M University, 718E Harrington Tower, 4225 TAMU, College Station, TX, 77843-4225, USA.

Abstract

Although meta-analyses of single-case experimental design (SCED) studies often include multiple types of dependent variables (DVs), the multiple DVs are rarely modeled explicitly in the analysis. Baek et al. (Journal of Experimental Education, 90(4), 934-961, 2022) identified several statistical issues that arise when researchers fail to model multiple DVs in meta-analyses of SCED data. However, the degree to which not modeling multiple DVs affects the results of SCED meta-analyses has not been fully examined. In this simulation study, we systematically investigated the impact of not modeling multiple DVs when analyzing SCED meta-analytic data using multilevel modeling. The results demonstrate that modeling multiple DVs has advantages over leaving them unmodeled: it yields precise DV-specific effects in addition to an unbiased and accurate average effect, as well as accurate estimates and inferences for the error variances at both the study level and the observation level. The current study also reveals potential factors (i.e., the number of DVs, the degree of heterogeneity in the level-1 error variances and autocorrelation, and the presence of a moderator effect) that affect the precision and accuracy of the variance parameter estimates.
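To make the modeling idea concrete, the following is a minimal sketch (not the authors' actual model) of how multiple DVs can be modeled rather than pooled in a multilevel analysis of single-case data: a DV-type indicator and its interaction with the treatment phase give DV-specific effects, while a random intercept captures between-case variation. All variable names and the simulated data are invented for illustration; a fuller model would also allow random treatment slopes and heterogeneous level-1 error variances and autocorrelation per DV.

```python
# Hypothetical sketch: multilevel model for single-case data with two DV types.
# Simulated data and names are illustrative, not from the article.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
rows = []
for case in range(6):                  # level-2 units (cases)
    u = rng.normal(0, 0.5)             # random intercept per case
    for dv in (0, 1):                  # two DV types per case
        for t in range(20):            # 20 observations per series
            phase = int(t >= 10)       # 0 = baseline, 1 = treatment
            effect = 1.0 + 0.5 * dv    # DV-specific treatment effect
            y = 2.0 + u + effect * phase + rng.normal(0, 1)
            rows.append(dict(case=case, dv=dv, t=t, phase=phase, y=y))
df = pd.DataFrame(rows)

# Fixed effects: phase, DV type, and their interaction (DV-specific effects);
# random intercept for case. Not modeling the DVs would amount to dropping
# the dv terms and pooling both series into one.
m = smf.mixedlm("y ~ phase * dv", df, groups=df["case"]).fit()
print(m.params["phase"])     # estimated treatment effect for DV type 0
print(m.params["phase:dv"])  # additional effect for DV type 1
```

The interaction term is what the "modeling multiple DVs" option buys: separate, interpretable effects per DV type instead of a single averaged effect with misattributed variance.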

Keywords

Heterogeneous structure; Meta-analysis; Moderator; Multilevel; Multiple; Single-case

References

  1. Baek, E., & Ferron, J. M. (2013). Multilevel models for multiple-baseline data: Modeling across participant variation in autocorrelation and residual variance. Behavior Research Methods, 45(1), 65–74. [DOI: 10.3758/s13428-012-0231-z]
  2. Baek, E., & Ferron, J. M. (2020). Modeling heterogeneity of the level-1 error covariance matrix in multilevel models for single-case data. Methodology, 16, 166–185. [DOI: 10.5964/meth.2817]
  3. Baek, E., Moeyaert, M., Petit-Bois, M., Beretvas, S. N., Van den Noortgate, W., & Ferron, J. M. (2014). The use of multilevel analysis for integrating single-case experimental design results within a study and across studies. Neuropsychological Rehabilitation, 24, 590–606. [DOI: 10.1080/09602011.2013.835740]
  4. Baek, E., Beretvas, S. N., Van den Noortgate, W., & Ferron, J. M. (2020). Brief research report: Bayesian versus REML estimation with noninformative priors in multilevel single-case data. Journal of Experimental Education, 88, 698–710. [DOI: 10.1080/00220973.2018.1527280]
  5. Baek, E., Luo, W., & Henri, M. (2022). Issues and solutions in meta-analysis of single-case design with multiple dependent variables using multilevel modeling. Journal of Experimental Education, 90(4), 934–961. [DOI: 10.1080/00220973.2020.1821342]
  6. Barlow, D. H., Nock, M. K., & Hersen, M. (2009). Single case experimental designs: Strategies for studying behavior change (3rd ed.). Pearson.
  7. Beretvas, S. N., & Chung, H. (2008). A review of meta-analyses of single-subject experimental designs: Methodological issues and practice. Evidence-Based Communication Assessment and Intervention, 2, 129–141. [DOI: 10.1080/17489530802446302]
  8. Denis, J., Van den Noortgate, W., & Maes, B. (2011). Self-injurious behavior in people with profound intellectual disabilities: A meta-analysis of single-case studies. Research in Developmental Disabilities, 32, 911–923. [DOI: 10.1016/j.ridd.2011.01.014]
  9. Farmer, J., Owens, C. M., Ferron, J. M., & Allsopp, D. (2010). A review of social science single-case meta-analyses. Paper presented at the annual meeting of the American Educational Research Association, Denver, CO.
  10. Ferron, J. M., Bell, B. A., Hess, M. R., Rendina-Gobioff, G., & Hibbard, S. T. (2009). Making treatment effect inferences from multiple-baseline data: The utility of multilevel modeling approaches. Behavior Research Methods, 41, 372–384. [DOI: 10.3758/BRM.41.2.372]
  11. Ferron, J. M., Farmer, J. L., & Owens, C. M. (2010). Estimating individual treatment effects from multiple-baseline data: A Monte Carlo study of multilevel modeling approaches. Behavior Research Methods, 42, 930–943. [DOI: 10.3758/BRM.42.4.930]
  12. Gast, D. L., & Ledford, J. R. (2014). Single case research methodology: Applications in special education and behavioral sciences. Routledge. [DOI: 10.4324/9780203521892]
  13. Hedges, L., Tipton, E., & Johnson, M. (2010). Robust variance estimation in meta-regression with dependent effect size estimates. Research Synthesis Methods, 1, 39–65. [DOI: 10.1002/jrsm.5]
  14. Hoogland, J. J., & Boomsma, A. (1998). Robustness studies in covariance structure modeling. An overview and a meta-analysis. Sociological Methods & Research, 26, 329–367. [DOI: 10.1177/0049124198026003003]
  15. Horner, R. H., & Odom, S. L. (2014). Constructing single case research designs: Logic and options. In T. R. Kratochwill & J. R. Levin (Eds.), Single-case intervention research: Methodological and statistical advances. American Psychological Association.
  16. Jamshidi, L., Heyvaert, M., Declercq, L., Fernández Castilla, B., Ferron, J. M., Moeyaert, M., Beretvas, S. N., Onghena, P., & Van den Noortgate, W. (2018). Methodological quality of meta-analyses of single-case experimental studies. Research in Developmental Disabilities, 79, 97–115. [DOI: 10.1016/j.ridd.2017.12.016]
  17. Joo, S.-H., & Ferron, J. M. (2019). Application of the within- and between-series estimators to non-normal multiple-baseline data: Maximum likelihood and Bayesian approaches. Multivariate Behavioral Research, 54, 666–689. [DOI: 10.1080/00273171.2018.1564877]
  18. Kazdin, A. E. (2011). Single-case research designs: Methods for clinical and applied settings (2nd ed.). Oxford University Press.
  19. Kazdin, A. E., & Kopel, S. A. (1975). On resolving ambiguities of the multiple-baseline design: Problems and recommendations. Behavior Therapy, 6, 601–608. [DOI: 10.1016/S0005-7894(75)80181-X]
  20. Kenward, M. G., & Roger, J. H. (1997). Small sample inference for fixed effects from restricted maximum likelihood. Biometrics, 53, 983–997. [DOI: 10.2307/2533558]
  21. Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2010). Single-case designs technical documentation. Retrieved from What Works Clearinghouse website: http://ies.ed.gov/ncee/wwc/pdf/wwc_scd.pdf
  22. Maggin, D., O’Keeffe, B., & Johnson, A. (2011). A quantitative synthesis of methodology in the meta-analysis of single-subject research for students with disabilities: 1985-2009. Exceptionality, 19, 109–135. [DOI: 10.1080/09362835.2011.565725]
  23. Moerbeek, M., van Breukelen, G. J. P., & Berger, M. P. F. (2000). Design issues for experiments in multilevel populations. Journal of Educational and Behavioral Statistics, 25, 271–284. [DOI: 10.2307/1165206]
  24. Moeyaert, M., Ugille, M., Ferron, J., Beretvas, S. N., & Van den Noortgate, W. (2013). The three-level synthesis of standardized single-subject experimental data: A Monte Carlo simulation study. Multivariate Behavioral Research, 48, 719–748. [DOI: 10.1080/00273171.2013.816621]
  25. Moeyaert, M., Ugille, M., Ferron, J., Beretvas, S. N., & Van den Noortgate, W. (2014). Three-level analysis of single-case experimental data: Empirical validation. The Journal of Experimental Education, 82, 1–21. [DOI: 10.1080/00220973.2012.745470]
  26. Moeyaert, M., Ugille, M., Ferron, J., Onghena, P., Heyvaert, M., Beretvas, S. N., & Van den Noortgate, W. (2015). Estimating intervention effects across different types of single-subject experimental designs: Empirical illustration. School Psychology Quarterly, 30, 50–63. [DOI: 10.1037/spq0000068]
  27. Owens, C. M., & Ferron, J. M. (2012). Synthesizing single-case studies: A Monte Carlo examination of a three-level meta-analytic model. Behavior Research Methods, 44, 795–805. [DOI: 10.3758/s13428-011-0180-y]
  28. Petit-Bois, M., Baek, E. K., Van den Noortgate, W., Beretvas, S. N., & Ferron, J. M. (2016). The consequences of modeling autocorrelation when synthesizing single-case studies using a three-level model. Behavior Research Methods, 48, 803–812. [DOI: 10.3758/s13428-015-0612-1]
  29. Shadish, W. R. (2014). Analysis and meta-analysis of single case designs: An introduction. Journal of School Psychology, 52(2), 109–122. [DOI: 10.1016/j.jsp.2013.11.009]
  30. Shadish, W. R., & Rindskopf, D. M. (2007). Methods for evidence-based practice: Quantitative synthesis of single-subject designs. New Directions for Evaluation, 113, 95–109. [DOI: 10.1002/ev.217]
  31. Shadish, W. R., & Sullivan, K. J. (2011). Characteristics of single-case designs used to assess intervention effects in 2008. Behavior Research Methods, 43, 971–980. [DOI: 10.3758/s13428-011-0111-y]
  32. Shadish, W. R., Hedges, L. V., & Pustejovsky, J. E. (2014). Analysis and meta-analysis of single-case designs with a standardized mean difference statistic: A primer and applications. Journal of School Psychology, 52, 123–147. [DOI: 10.1016/j.jsp.2013.11.005]
  33. Shogren, K. A., Fagella-Luby, M. N., Bae, J. S., & Wehmeyer, M. L. (2004). The effect of choice-making as an intervention for problem behavior. Journal of Positive Behavior Interventions, 6, 228–237. [DOI: 10.1177/10983007040060040401]
  34. Ugille, M., Moeyaert, M., Beretvas, T., Ferron, J., & Van den Noortgate, W. (2012). Multilevel meta-analysis of single-subject experimental designs: A simulation study. Behavior Research Methods, 44, 1244–1254. [DOI: 10.3758/s13428-012-0213-1]
  35. Van den Noortgate, W., & Onghena, P. (2003a). Combining single-case experimental data using hierarchical linear models. School Psychology Quarterly, 18, 325–346. [DOI: 10.1521/scpq.18.3.325.22577]
  36. Van den Noortgate, W., & Onghena, P. (2003b). Hierarchical linear models for the quantitative integration of effect sizes in single-case research. Behavior Research Methods, Instruments, & Computers, 35, 1–10. [DOI: 10.3758/BF03195492]
  37. Van den Noortgate, W., & Onghena, P. (2008). A multilevel meta-analysis of single subject experimental design studies. Evidence-Based Communication Assessment and Intervention, 3, 142–151. [DOI: 10.1080/17489530802505362]
  38. Wampold, B. E., & Serlin, R. C. (2000). The consequence of ignoring a nested factor on measures of effect size in analysis of variance. Psychological Methods, 5, 425–433. [DOI: 10.1037/1082-989X.5.4.425]
  39. Wang, S., Cui, Y., & Parrila, R. (2011). Examining the effectiveness of peer-mediated and video-modeling social skills interventions for children with autism spectrum disorders: a meta-analysis in single-case research using HLM. Research in Autism Spectrum Disorders, 5, 562–569. [DOI: 10.1016/j.rasd.2010.06.023]

MeSH Terms

Humans
Research Design
Computer Simulation
Educational Status
Research Personnel

