
Gastrointestinal Endoscopy Competency Assessment Tool: reliability and validity evidence

Published: March 06, 2015. DOI: https://doi.org/10.1016/j.gie.2014.11.030

      Background

      Rigorously developed and validated direct observational assessment tools are required to support competency-based colonoscopy training to facilitate skill acquisition, optimize learning, and ensure readiness for unsupervised practice.

      Objective

      To examine reliability and validity evidence of the Gastrointestinal Endoscopy Competency Assessment Tool (GiECAT) for colonoscopy for use within the clinical setting.

      Design

      Prospective, observational, multicenter validation study. Sixty-one endoscopists performing 116 colonoscopies were assessed using the GiECAT, which consists of a 7-item global rating scale (GRS) and a 19-item checklist (CL). A second rater assessed procedures to determine interrater reliability by using intraclass correlation coefficients (ICCs). Endoscopists’ first and second procedure scores were compared to determine test-retest reliability by using ICCs. Discriminative validity was examined by comparing novice, intermediate, and experienced endoscopists’ scores. Concurrent validity was measured by correlating scores with colonoscopy experience, cecal and terminal ileal intubation rates, and physician global assessment.
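
      The statistical approach described above (ICCs for interrater and test-retest reliability, group comparisons for discriminative validity, and Spearman correlations for concurrent validity) can be illustrated with a short analysis sketch. The code below is not the authors' analysis; it uses hypothetical data and the open-source pingouin and SciPy libraries, and it only shows the general form such calculations could take. The abstract does not specify the omnibus test used for the group comparison, so a Kruskal-Wallis test appears purely as an example.

      import pandas as pd
      import pingouin as pg
      from scipy import stats

      # Hypothetical long-format ratings: one row per (procedure, rater) pair.
      ratings = pd.DataFrame({
          "procedure":    [1, 1, 2, 2, 3, 3, 4, 4, 5, 5],
          "rater":        ["A", "B"] * 5,
          "giecat_total": [85, 82, 60, 63, 74, 71, 91, 88, 55, 58],
      })

      # Interrater reliability: intraclass correlation coefficient (ICC).
      icc = pg.intraclass_corr(data=ratings, targets="procedure",
                               raters="rater", ratings="giecat_total")
      print(icc[["Type", "ICC", "CI95%"]])

      # Discriminative validity: compare hypothetical scores across experience groups
      # (the specific test used in the study is not stated in the abstract).
      novice, intermediate, experienced = [55, 60, 58], [70, 74, 72], [88, 91, 90]
      h, p_kw = stats.kruskal(novice, intermediate, experienced)
      print(f"Kruskal-Wallis H = {h:.2f}, P = {p_kw:.3f}")

      # Concurrent validity: Spearman's rho between GiECAT total score and
      # number of previous colonoscopies (hypothetical paired values).
      scores = [55, 60, 58, 70, 74, 72, 88, 91, 90]
      prior_procedures = [20, 35, 45, 150, 300, 420, 1100, 1500, 2000]
      rho, p_rho = stats.spearmanr(scores, prior_procedures)
      print(f"Spearman's rho = {rho:.2f}, P = {p_rho:.3f}")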

      Setting

      A total of 116 colonoscopies performed by 33 novice (<50 previous procedures), 18 intermediate (50-500 previous procedures), and 10 experienced (>1000 previous procedures) endoscopists from 6 Canadian hospitals.

      Main Outcome Measurements

      Interrater and test-retest reliability, and discriminative and concurrent validity.

      Results

      Interrater reliability was high (total: ICC = 0.85; GRS: ICC = 0.85; CL: ICC = 0.81). Test-retest reliability was excellent (total: ICC = 0.91; GRS: ICC = 0.93; CL: ICC = 0.80). Significant differences in GiECAT scores among novice, intermediate, and experienced endoscopists were noted (P < .001). There was a significant positive correlation (Spearman's ρ, P < .001) between scores and number of previous colonoscopies (total: ρ = 0.78; GRS: ρ = 0.80; CL: ρ = 0.71); cecal intubation rate (total: ρ = 0.81; GRS: ρ = 0.82; CL: ρ = 0.75); ileal intubation rate (total: ρ = 0.82; GRS: ρ = 0.82; CL: ρ = 0.77); and physician global assessment (total: ρ = 0.90; GRS: ρ = 0.94; CL: ρ = 0.77).

      Limitations

      Nonblinded assessments.

      Conclusion

      This study provides evidence supporting the reliability and validity of the GiECAT for use in assessing the performance of live colonoscopies in the clinical setting.

      Abbreviations:

      CL (checklist), GiECAT (Gastrointestinal Endoscopy Competency Assessment Tool), GRS (global rating scale), ICC (intraclass correlation coefficient), PGA (physician global assessment)
