A primer for using multilevel models to meta-analyze single case design data with AB phases.

Jessica L Becraft, John C Borrero, Shuyan Sun, Anlara A McKenzie
Author Information
  1. Jessica L Becraft: Johns Hopkins University School of Medicine.
  2. John C Borrero: UMBC.
  3. Shuyan Sun: UMBC.
  4. Anlara A McKenzie: Kennedy Krieger Institute.

Abstract

Meta-analytic methods provide a way to synthesize data across treatment evaluation studies. However, these well-accepted methods are infrequently applied to behavior-analytic studies. Multilevel models may be a promising method for meta-analyzing single-case data. This technical article provides a primer on how to conduct a multilevel-model meta-analysis of single-case designs with AB phases, using data from the differential-reinforcement-of-low-rate behavior literature. We provide details, recommendations, and considerations for searching for appropriate studies, organizing the data, and conducting the analyses. All data sets are available so that the reader can follow along with this primer. The purpose of this technical article is to minimally equip behavior analysts to complete a meta-analysis that summarizes the current state of affairs in the science of behavior analysis and its practice. Moreover, we aim to demonstrate the value of analyses of this sort for behavior analysis.
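
As a rough illustration of the kind of model the primer walks through, the sketch below fits a three-level multilevel model in R with the lme4 package: sessions (level 1) nested within cases (level 2) nested within studies (level 3), with phase as the treatment indicator. This is a minimal sketch under stated assumptions, not the authors' exact analysis; the choice of lme4, the file name, and the variable names (study, case, phase, rate) are illustrative placeholders.

  # Hypothetical long-format data set: one row per session, with columns
  #   study - study identifier
  #   case  - participant identifier within study
  #   phase - 0 = baseline (A), 1 = treatment (B)
  #   rate  - responses per minute (dependent variable)
  library(lme4)

  scd <- read.csv("drl_scd_data.csv")  # placeholder file name

  # Level 1: sessions; level 2: cases; level 3: studies.
  # The fixed effect of phase estimates the average treatment effect across
  # studies; the nested random effects allow the baseline level and the
  # treatment effect to vary across cases and across studies.
  fit <- lmer(rate ~ phase + (1 + phase | study/case), data = scd)

  summary(fit)                    # fixed effects and variance components
  confint(fit, method = "Wald")   # Wald confidence intervals (fixed effects)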

Keywords

Single case design; hierarchical linear modeling; quantitative review

MeSH Terms

Data Analysis
Humans
Meta-Analysis as Topic
Multilevel Analysis
Reinforcement, Psychology
Research Design
