
A prospective comparison of live and video-based assessments of colonoscopy performance

Published: August 28, 2017 · DOI: https://doi.org/10.1016/j.gie.2017.08.020

      Background and Aims

      Colonoscopy performance is typically assessed by a supervisor in the clinical setting. This approach has limitations, however: it allows for rater bias and increases the supervisor's workload during the procedure. Video-based assessment of recorded procedures has been proposed as a complementary means of assessing colonoscopy performance. This study sought to investigate the reliability, validity, and feasibility of video-based assessments of competence in performing colonoscopy compared with live assessment.

      Methods

      Novice (<50 previous colonoscopies), intermediate (50-500), and experienced (>1000) endoscopists from 5 hospitals participated. Two views of each colonoscopy were videotaped: an endoscopic (intraluminal) view and a recording of the endoscopist’s hand movements. Recorded procedures were independently assessed by 2 blinded experts using the Gastrointestinal Endoscopy Competency Assessment Tool (GiECAT), a validated procedure-specific assessment tool comprising a global rating scale (GRS) and checklist (CL). Live ratings were conducted by a non-blinded expert endoscopist. Outcomes included agreement between live and blinded video-based ratings of clinical colonoscopies; intra-rater reliability, inter-rater reliability, and discriminative validity of video-based assessments; and perceived ease of assessment.

      Results

      Forty endoscopists participated (20 novices, 10 intermediates, and 10 experienced). There was good agreement between the live and video-based ratings (total, intra-class correlation [ICC] = 0.847; GRS, ICC = 0.868; CL, ICC = 0.749). Intra-rater reliability was excellent (total, ICC = 0.99; GRS, ICC = 0.99; CL, ICC = 0.98). Inter-rater reliability between the 2 blinded video-based raters was high (total, ICC = 0.91; GRS, ICC = 0.918; CL, ICC = 0.862). GiECAT total, GRS, and CL scores differed significantly among novice, intermediate, and experienced endoscopists (P < .001). Video-based assessments were perceived as “fairly easy,” although live assessments were rated as significantly easier (P < .001).
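The agreement and reliability statistics above are intra-class correlation coefficients. As a minimal illustration (not the authors' analysis code), a two-way random-effects, absolute-agreement ICC of the form ICC(2,1) can be computed from a subjects-by-raters score matrix via a standard two-way ANOVA decomposition:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: array-like of shape (n subjects, k raters).
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means

    # Two-way ANOVA sums of squares
    ss_total = ((ratings - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()   # between subjects
    ss_cols = n * ((col_means - grand) ** 2).sum()   # between raters
    ss_err = ss_total - ss_rows - ss_cols            # residual

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    # Shrout & Fleiss ICC(2,1)
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )
```

For example, two raters who agree perfectly across subjects yield an ICC of 1.0; small disagreements between raters relative to the spread across subjects keep the coefficient close to 1, which is the pattern reflected in the inter-rater values reported above.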

      Conclusions

      Video-based assessments of colonoscopy procedures using the GiECAT have strong evidence of reliability and validity. In addition, assessments using videos were feasible, although live assessments were easier.

      Abbreviations:

      CI (confidence interval), CL (checklist), GiECAT (Gastrointestinal Endoscopy Competency Assessment Tool), GRS (global rating scale), ICC (intra-class correlation coefficient), SD (standard deviation)


      Linked Article

      • Video-based performance assessment in endoscopy: Moving beyond “see one, do one, teach one”?
        Gastrointestinal Endoscopy, Vol. 87, Issue 3
        • Preview
          We are living in an era of incredible and rapid change in healthcare delivery, with economic forces, technological innovations, and even social media1 rapidly changing the practice of gastroenterology. Over the past 30 years, GI endoscopy has been shaped by improvements in video quality and processing, with each new generation of endoscopes offering greater diagnostic and therapeutic potential, and in the process increasing the breadth of diseases that are amenable to endoscopic management. However, despite the rapid advancements in endoscope technology, the traditional method of teaching endoscopy and assessing competency has hardly changed; namely, bedside proctoring with real-time feedback and hands-on instruction.