Global assessment of surgical skills (GASS): validation of a new instrument to measure global technical safety in surgical procedures.

Peter Nau, Erin Worden, Ryan Lehmann, Kyle Kleppe, Gregory J Mancini, Matt L Mancini, Bruce Ramshaw
Author Information
  1. Peter Nau: Department of Surgery, Section of Bariatric Surgery, University of Iowa Hospitals and Clinics, 200 Hawkins Drive, Iowa City, IA, 52242, USA. peter-nau@uiowa.edu.
  2. Erin Worden: Department of Surgery, Section of Bariatric Surgery, University of Iowa Hospitals and Clinics, 200 Hawkins Drive, Iowa City, IA, 52242, USA.
  3. Ryan Lehmann: Department of Surgery, Section of Bariatric Surgery, University of Iowa Hospitals and Clinics, 200 Hawkins Drive, Iowa City, IA, 52242, USA.
  4. Kyle Kleppe: Department of Surgery, Section of Foregut Surgery, University of Tennessee, Knoxville, TN, USA.
  5. Gregory J Mancini: Department of Surgery, Section of Foregut Surgery, University of Tennessee, Knoxville, TN, USA.
  6. Matt L Mancini: Department of Surgery, Section of Foregut Surgery, University of Tennessee, Knoxville, TN, USA.
  7. Bruce Ramshaw: CQInsights PBC, Knoxville, TN, USA.

Abstract

BACKGROUND: Broad implementation of the American Board of Surgery's entrustable professional activities initiative will require assessment instruments that are reliable and easy to use. Existing instruments for assessing general laparoscopic surgical skills have limited reliability, efficiency, and validity across the spectrum of formative (low-stakes) and summative (high-stakes) assessments. A novel six-item global assessment of surgical skills (GASS) instrument was developed and evaluated, with a focus on a scoring rubric that distinguishes safe from unsafe surgical practice.
METHODS: The GASS was developed through iterative engagement with expert laparoscopic surgeons and includes six items (economy of motion, tissue handling, appreciating operative anatomy, bimanual dexterity, achievement of hemostasis, overall performance), each scored on a uniform three-point rubric ("poor-unsafe", "adequate-safe", "good-safe"). In a cross-sectional study of inter-rater reliability, four bariatric surgeons with 4 to 28 years of experience applied the GASS and the Global Operative Assessment of Laparoscopic Skills (GOALS) to 30 consecutive Roux-en-Y gastric bypass operative videos. Inter-rater reliability was assessed with Gwet's AC using a simplified dichotomous "safe" versus "unsafe" scoring rubric.
RESULTS: Inter-rater reliability for the GASS was very high across all six domains (0.88-1.00) and comparable to that of the GOALS (0.96-1.00). Economy of motion and bimanual dexterity received the highest proportions of unsafe ratings (9.2% and 5.8%, respectively).
CONCLUSION: The GASS, a novel six-item instrument for assessing general laparoscopic surgical skills, was designed with a simple scoring rubric ("poor-unsafe", "adequate-safe", "good-safe") to minimize rater burden and to focus trainee feedback and promotion evaluations on safe surgical performance. Initial evaluation of the GASS is promising, demonstrating high inter-rater reliability. Future research will assess the GASS across a broader spectrum of laparoscopic procedures.
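
Note on the reliability statistic: the Methods dichotomize each item rating as "safe" versus "unsafe" and report agreement with Gwet's AC. The sketch below is an illustrative reconstruction only, assuming the first-order coefficient (Gwet's AC1) and that every video is scored by every rater; the function name, the example ratings, and the rater/video counts shown are hypothetical and are not study data.

    # Illustrative sketch (not the authors' code): Gwet's AC1 for a dichotomous
    # "safe" (1) vs. "unsafe" (0) rubric, assuming no missing ratings.

    def gwets_ac1(ratings):
        """ratings: list of per-video lists, one 0/1 score per rater."""
        n = len(ratings)                      # number of videos (subjects)
        # Observed agreement: average pairwise agreement within each video.
        pa_terms = []
        for scores in ratings:
            r = len(scores)                   # raters scoring this video
            safe = sum(scores)                # raters who scored "safe"
            agree = (safe * (safe - 1) + (r - safe) * (r - safe - 1)) / (r * (r - 1))
            pa_terms.append(agree)
        p_a = sum(pa_terms) / n
        # Chance agreement for two categories: 2 * pi * (1 - pi), where pi is
        # the mean per-video proportion of "safe" ratings.
        pi = sum(sum(s) / len(s) for s in ratings) / n
        p_e = 2 * pi * (1 - pi)
        return (p_a - p_e) / (1 - p_e)

    # Hypothetical example: 5 videos scored safe/unsafe by 4 raters.
    example = [
        [1, 1, 1, 1],
        [1, 1, 1, 1],
        [1, 1, 0, 1],
        [1, 1, 1, 1],
        [0, 0, 0, 1],
    ]
    print(round(gwets_ac1(example), 3))       # -> 0.706

Gwet's AC1 is often preferred over Cohen's kappa when one category dominates, a setting in which kappa can collapse despite near-perfect raw agreement; that is relevant here because unsafe ratings were uncommon (at most 9.2% for any item).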

Keywords

MeSH Terms

Humans
Reproducibility of Results
Cross-Sectional Studies
Clinical Competence
Laparoscopy
Videotape Recording
