
Defining the learning curves of colorectal surgical trainees in colonoscopy using the Assessment of Competency in Endoscopy tool

Published: February 22, 2022. DOI: https://doi.org/10.1016/j.gie.2022.02.019

      Background and Aims

      Gastroenterology fellows require on average 250 to 275 colonoscopies to achieve competency, whereas 50 colonoscopies is deemed adequate for surgical trainees. Because the 2 training pathways use different assessment methods, direct comparison of the groups has been impossible. At the Mayo Clinic, colonoscopy training of gastroenterology and colorectal surgery (CRS) fellows was merged in 2017, providing a unique opportunity to define the learning curves of CRS trainees using the Assessment of Competency in Endoscopy (ACE) evaluation tool.

      Methods

      In a single-center retrospective descriptive study, ACE scores were collected on colonoscopies performed by CRS fellows over a period of 4 academic years. By calculating the average scores at every 25 procedures of experience, the CRS colonoscopy learning curves were described for each core cognitive and motor skill.

      Results

      Twelve CRS fellows (8 men, 4 women) had an average prior experience of 123 colonoscopies (range, 50-266) performed during general surgery residency. During CRS fellowship, an average of 136 colonoscopies (range, 116-173) were graded per fellow. Although the competency goals for a few metrics were met earlier, most motor and cognitive ACE metrics reached the minimum competency thresholds at 275 to 300 procedures.

      Conclusions

      CRS fellows reached competency in colonoscopy at around 275 to 300 procedures of experience, a trajectory similar to previously reported data for gastroenterology fellows, suggesting little difference in the learning curves between these 2 groups. In addition, no trainee was deemed competent at the onset of training despite an average experience well over the 50 colonoscopies required during residency.


      Abbreviations:

      ACE (Assessment of Competency in Endoscopy), CRS (colorectal surgery), FES (Fundamentals of Endoscopic Surgery)
      Over the past decade, an increasing emphasis has been placed on quality metrics and competency-based assessments in health care. For gastroenterology, this includes assessing and documenting trainees’ progression of core skills for basic colonoscopy. A commonly used tool for this is the validated Assessment of Competency in Endoscopy (ACE) tool developed at the Mayo Clinic and supported by the American Society for Gastrointestinal Endoscopy.[1]
      Validation studies using this tool have established learning curves for colonoscopy skills, defined minimal performance thresholds for competence, and revealed that the gastroenterology societies’ previous training expectation of 140 colonoscopies was not adequate.[2]
      In these reports, it was determined that roughly 250 to 275 colonoscopies are required for the average trainee to achieve competence. As such, the American Society for Gastrointestinal Endoscopy updated its training recommendations on procedure volume and, more importantly, stressed the use of objective ongoing assessment throughout training with bedside assessment tools such as the ACE.[3,4]
      During the same period that these changes were occurring in the gastroenterology community, the American Board of Surgery pursued a different path: it developed a separate assessment tool, the Global Assessment of Gastrointestinal Endoscopic Skills (GAGES), and nationally adopted the learning and testing program entitled the Fundamentals of Endoscopic Surgery (FES).[5,6]
      Reports examining this tool and program have indicated that surgical trainees can achieve competence at 50 colonoscopies. Because gastroenterology and surgery trainees typically receive their endoscopic training separately and, when assessment is performed, use different means to define and measure competence, it has been impossible to directly compare outcomes between the 2 groups.[7]
      In July 2017, in recognition by Mayo Clinic leadership that there can be only 1 standard of practice for endoscopic procedures, the endoscopy practices of gastroenterology and CRS were merged, as were the endoscopic training environments. As a result, all gastroenterology and CRS staff were then held to the same set of quality metrics as part of their annual review process.[8] Similarly, all gastroenterology and CRS fellows began training in the same environment and were assessed using the ACE tool to monitor skills progression and competency. This provided a unique opportunity to describe, using the ACE tool, the colonoscopy learning curves of surgical trainees reported here and to compare them with the published performance results of gastroenterology trainees.

      Methods

      Overview

      In a 4-year retrospective single-center study, ACE colonoscopy performance scores of our CRS trainees were continuously collected and then analyzed at set milestones of training (every 25 colonoscopies of experience), with the primary objective of establishing the learning curves for each cognitive and motor skill for this cohort of learners.

      Subjects

      All CRS fellows who trained in colonoscopy at the Mayo Clinic in Rochester, Minnesota from July 2017 to June 2021 were included in this study. The collected data come from the routine teaching and assessment process that has been in use by the gastroenterology fellowship program for over a decade. As an educational project, no identifying patient information was collected, and all trainee and staff identifying information was stripped from the data. The Mayo Clinic Institutional Review Board approved this educational research protocol and waived the need for trainee consent in accordance with 45 C.F.R. 46.117 (c) (2).

      Data collection

      Colonoscopies performed by CRS fellows were assessed using the ACE colonoscopy tool, which has been previously described and validated.[9]
      All cases included in this study involved fellow hands-on participation and were started with the intent of the fellow independently reaching the cecum and managing the findings. All cases of colonoscopy performed were for screening or surveillance purposes, and all were performed with patients under conscious sedation. Fellow performance data were recorded immediately after each supervised colonoscopy and entered directly into the ProVation endoscopy reporting system (ProVation Medical Inc, Minneapolis, Minn, USA) as part of the procedure record (although not part of the written report seen in the patient record). Before the staff was able to sign the procedure note, the trainee performance assessment had to be recorded on any procedure where a fellow was identified to have participated. Although some procedures did not have the ACE tool completed because of the CRS fellow being misidentified in ProVation as a surgical resident (rather than a fellow), these procedures were still captured for the individual’s total numbers and to track the order of subsequent evaluation completion. These data were then extracted from the ProVation database on a monthly basis to allow the training programs to generate performance and feedback reports for the fellows. For the purposes of this analysis, the performance data from the last 4 academic years were deidentified of any patient, staff, or trainee personal information.
      The ACE tool recorded 14 cognitive and motor skills (7 each) considered to be the core elements necessary to perform a competent colonoscopy (Tables 1 and 2). These cognitive and motor skills were scored using a 4-point Likert scale (1 = novice, 2 = intermediate, 3 = advanced, and 4 = superior). For each metric, the behaviors that defined each score were available on the scoring form to allow for reproducible scoring (Appendix 1, available online at www.giejournal.org). Scores of 1 to 3 were defined as performance in a progression toward competence, whereas a score of 4 indicated that the performance in that skill was believed to have been performed competently during that given procedure. In addition to these 14 metrics, the depth of furthest independent scope insertion, time to reach the cecum, withdrawal time, and independent polyp identification were recorded. These additional metrics allowed for the calculation of successful cecal intubation rates and times as well as polyp detection rates (Table 3).
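      As a rough illustration of the data captured for each supervised procedure, one ACE evaluation record can be sketched as below. The field names and structure are illustrative assumptions only and do not reflect the actual ProVation schema.

```python
# A minimal sketch (not the actual ProVation schema) of the data captured for
# each supervised colonoscopy, as described above: 7 motor and 7 cognitive
# skills scored on a 1-4 Likert scale, plus the additional fields used to
# derive the calculated metrics in Table 3. All field names are illustrative.
from dataclasses import dataclass, field
from typing import Dict, Optional

MOTOR_SKILLS = [
    "air_water_suction", "scope_steering", "fine_tip_control",
    "loop_reduction", "mucosal_visualization", "ability_to_apply_tool",
    "overall_hands_on",
]
COGNITIVE_SKILLS = [
    "fellow_knowledge", "pain_management", "lumen_identification",
    "pathology_identification", "pathology_location", "knowledge_of_tool",
    "overall_cognitive",
]

@dataclass
class AceEvaluation:
    procedure_number: int                      # overall experience, prior cases included
    motor: Dict[str, int] = field(default_factory=dict)      # each skill scored 1-4
    cognitive: Dict[str, int] = field(default_factory=dict)  # each skill scored 1-4
    reached_cecum_independently: bool = False
    cecal_intubation_minutes: Optional[float] = None
    withdrawal_minutes: Optional[float] = None
    polyps_present: bool = False
    fellow_found_a_polyp: bool = False         # fellow independently identified >=1 polyp
    fellow_missed_a_polyp: bool = False        # staff saw a polyp the fellow did not
```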
      Table 1. Motor skills
      No. of procedures | No. of cases | Use of air, water, and suction | Scope steering technique | Fine tip control | Loop reduction | Mucosal visualization | Ability to apply tool | Overall hands-on skills
      50 | 12 | 2.2 (1.7-2.6) | 2.4 (2.1-2.7) | 2.5 (2.2-2.8) | 2.3 (1.9-2.6) | 2.5 (2.0-3.0) | 2.4 (2.0-2.8) | 2.3 (2.0-2.6)
      75 | 81 | 2.6 (2.4-2.8) | 2.5 (2.3-2.7) | 2.5 (2.3-2.7) | 2.5 (2.3-2.7) | 2.7 (2.5-2.9) | 2.5 (2.3-2.7) | 2.6 (2.4-2.8)
      100 | 89 | 2.7 (2.6-2.9) | 2.8 (2.7-3.0) | 2.7 (2.5-2.9) | 2.7 (2.6-2.9) | 2.9 (2.7-3.0) | 2.7 (2.5-2.9) | 2.8 (2.6-2.9)
      125 | 102 | 3.0 (2.9-3.1) | 2.9 (2.8-3.1) | 2.9 (2.7-3.0) | 2.9 (2.7-3.0) | 3.1 (2.9-3.2) | 2.9 (2.7-3.1) | 2.9 (2.8-3.0)
      150 | 131 | 3.0 (2.9-3.1) | 2.9 (2.8-3.1) | 2.9 (2.8-3.1) | 2.8 (2.7-3.0) | 3.0 (2.9-3.2) | 3.0 (2.8-3.2) | 2.9 (2.7-3.0)
      175 | 160 | 3.2 (3.1-3.3) | 3.2 (3.0-3.3) | 3.1 (3.0-3.3) | 3.1 (2.9-3.2) | 3.3 (3.2-3.4) | 3.1 (2.9-3.2) | 3.2 (3.1-3.3)
      200 | 132 | 3.1 (2.9-3.2) | 3.0 (2.9-3.2) | 3.0 (2.9-3.1) | 3.0 (2.8-3.1) | 3.1 (3.0-3.2) | 3.0 (2.8-3.2) | 3.0 (2.9-3.1)
      225 | 76 | 3.2 (3.1-3.4) | 3.1 (3.0-3.3) | 3.1 (2.9-3.3) | 3.2 (3.0-3.3) | 3.4 (3.2-3.5) | 3.1 (2.8-3.3) | 3.1 (2.9-3.3)
      250 | 80 | 3.2 (3.0-3.3) | 3.3 (3.1-3.4) | 3.2 (3.0-3.3) | 3.2 (3.1-3.4) | 3.3 (3.1-3.4) | 3.2 (3.0-3.4) | 3.2 (3.1-3.4)
      275 | 61 | 3.4 (3.3-3.6) | 3.4 (3.2-3.6) | 3.4 (3.1-3.6) | 3.3 (3.1-3.5) | 3.5 (3.3-3.6) | 3.3 (3.1-3.6) | 3.4 (3.2-3.6)
      300 | 29 | 3.6 (3.3-3.8) | 3.6 (3.3-3.8) | 3.5 (3.2-3.8) | 3.6 (3.3-3.9) | 3.5 (3.2-3.8) | 3.3 (2.8-3.8) | 3.5 (3.2-3.8)
      Competency threshold | — | ≥3.5 | ≥3.5 | ≥3.5 | ≥3.5 | ≥3.5 | ≥3.5 | ≥3.5
      The mean scores (95% confidence interval) for the motor skills are shown at each stage of training. Competency for each skill is defined as a mean score at or above the adopted threshold of ≥3.5 (bottom row); this is achieved in nearly all motor skills at 300 procedures.
      Table 2. Cognitive skills
      No. of procedures of experience | No. of cases | Fellow's knowledge | Pain management | Lumen identification | Pathology identification | Accurate location of pathology | Knowledge of tool | Overall cognitive skills
      50 | 12 | 2.7 (2.4-3.0) | 2.5 (2.2-2.8) | 2.4 (2.1-2.7) | 2.1 (1.5-2.7) | 2.5 (1.9-3.1) | 2.1 (1.6-2.6) | 3.0 (2.5-3.5)
      75 | 81 | 2.8 (2.6-3.0) | 2.8 (2.6-2.9) | 2.6 (2.4-2.8) | 2.6 (2.4-2.8) | 3.0 (2.7-3.2) | 2.7 (2.5-3.0) | 2.9 (2.7-3.0)
      100 | 89 | 2.8 (2.6-2.9) | 2.9 (2.7-3.0) | 2.8 (2.6-2.9) | 2.6 (2.4-2.9) | 2.9 (2.7-3.1) | 2.7 (2.5-2.9) | 2.9 (2.8-3.1)
      125 | 102 | 3.0 (2.8-3.1) | 3.0 (2.9-3.1) | 3.0 (2.8-3.1) | 2.8 (2.7-3.0) | 3.0 (2.8-3.2) | 2.9 (2.7-3.1) | 3.0 (2.9-3.2)
      150 | 131 | 3.0 (2.9-3.2) | 3.0 (2.8-3.1) | 3.0 (2.9-3.1) | 3.0 (2.8-3.2) | 3.0 (2.8-3.2) | 2.9 (2.7-3.1) | 3.0 (2.9-3.1)
      175 | 160 | 3.3 (3.2-3.4) | 3.2 (3.1-3.3) | 3.3 (3.2-3.4) | 3.2 (3.1-3.3) | 3.2 (3.1-3.4) | 3.2 (3.0-3.3) | 3.4 (3.3-3.5)
      200 | 132 | 3.2 (3.0-3.3) | 3.1 (3.0-3.2) | 3.1 (3.0-3.2) | 3.2 (3.0-3.3) | 3.2 (3.1-3.4) | 3.1 (3.0-3.3) | 3.2 (3.1-3.3)
      225 | 76 | 3.2 (3.0-3.3) | 3.2 (3.0-3.4) | 3.2 (3.0-3.4) | 3.2 (3.0-3.4) | 3.3 (3.1-3.5) | 3.1 (2.9-3.3) | 3.2 (3.0-3.4)
      250 | 80 | 3.2 (3.1-3.2) | 3.2 (3.1-3.4) | 3.3 (3.1-3.4) | 3.3 (3.1-3.4) | 3.3 (3.1-3.4) | 3.1 (3.0-3.3) | 3.3 (3.2-3.4)
      275 | 61 | 3.6 (3.5-3.8) | 3.4 (3.2-3.6) | 3.4 (3.2-3.6) | 3.5 (3.3-3.7) | 3.5 (3.3-3.7) | 3.5 (3.3-3.7) | 3.6 (3.5-3.8)
      300 | 29 | 3.6 (3.3-3.8) | 3.4 (3.1-3.8) | 3.6 (3.3-3.8) | 3.5 (3.1-3.9) | 3.5 (3.1-3.8) | 3.5 (2.9-3.8) | 3.8 (3.6-3.9)
      Competency threshold | — | ≥3.5 | ≥3.5 | ≥3.5 | ≥3.5 | ≥3.5 | ≥3.5 | ≥3.5
      The mean scores (95% confidence interval) for the cognitive skills are shown at each stage of training. Competency for each skill is defined as a mean score at or above the adopted threshold of ≥3.5 (bottom row); this is achieved for nearly all cognitive skills by 275-300 procedures.
      Table 3. Calculated metrics
      No. of procedures of experience | No. of cases | Average cecal intubation time (min) | Successful cecal intubation rate (%) | Polyp detection rate (%) | Polyp miss rate (%) | Rate of overall competence (%)
      50 | 12 | 19.3 (14.4-24.1) | 45 (10-81) | 42 (9-74) | 43 (0-93) | 0
      75 | 81 | 18.7 (16.5-21.0) | 58 (46-69) | 44 (33-55) | 29 (15-44) | 14 (6-21)
      100 | 89 | 16.9 (15.1-18.8) | 75 (65-84) | 52 (41-62) | 31 (18-45) | 16 (8-23)
      125 | 102 | 17.5 (15.8-19.2) | 72 (63-81) | 50 (40-60) | 36 (23-49) | 18 (10-25)
      150 | 131 | 14.3 (12.7-15.8) | 80 (72-87) | 52 (43-61) | 27 (16-37) | 18 (11-24)
      175 | 160 | 15.2 (13.6-16.7) | 77 (71-84) | 54 (46-62) | 23 (14-31) | 32 (25-40)
      200 | 132 | 14.2 (12.7-15.7) | 83 (77-90) | 58 (50-66) | 16 (7-24) | 23 (16-31)
      225 | 76 | 13.3 (11.4-15.4) | 88 (80-96) | 63 (52-74) | 14 (4-24) | 29 (19-40)
      250 | 80 | 10.8 (9.4-12.2) | 95 (90-100) | 56 (45-67) | 16 (6-27) | 29 (19-40)
      275 | 61 | 12.2 (9.9-14.6) | 92 (85-99) | 66 (53-78) | 19 (6-31) | 49 (36-62)
      300 | 29 | 12.9 (9.6-16.1) | 93 (82-100) | 59 (40-78) | 17 (0-36) | 57 (38-77)
      Competency threshold | — | ≤15 | ≥90 | ≥50 | ≤25 | —
      Average values (95% confidence interval) for the time to reach the cecum and for successful independent cecal intubation rates are shown, along with polyp detection and miss rates. Values meeting the respective adopted thresholds (bottom row) indicate competency. The rate of overall competence indicates the percentage of procedures at that stage where all separate motor, cognitive, and calculated competency benchmarks were reached.
      Staff have been using this tool for over a decade and are well versed in the questions and descriptors. When new staff begin, they are given a brief overview of the form and the descriptors for how each question should be scored based on performance (Appendix 1). These scoring gradations and the performance expectations required to receive each score are also available with a single click in the ProVation software when completing the evaluations. For the validation of this tool, please see previous publications regarding methods and validity evidence.[1,2,9]

      Analysis methods

      The scores for each metric were calculated at stages of every 25 procedures of experience using an average of the 20 ACE scores recorded around each distinct milestone in training. For example, for the 50th-procedure interval, scores for the 20 procedures bracketing it (the 41st-60th colonoscopies) were averaged together. This ensures that no single procedure on a particularly difficult or easy patient skews the individual's average performance at that stage. Each metric's average score was then plotted with the scores of the other training milestones (50th, 75th, 100th, etc) to establish learning curves for CRS fellows for each skill. Because many fellows did not complete more than 300 total procedures of experience by the end of fellowship, the number of evaluations available at a given stage began dropping off after the 300-procedure interval; the analysis was therefore carried out only to this interval.
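      The milestone averaging just described can be sketched as follows. The function names and the AceEvaluation structure are the illustrative ones sketched earlier, not the authors' actual analysis code.

```python
# A minimal sketch of the milestone averaging described above: for each
# 25-procedure milestone, the 20 evaluations bracketing it (e.g., the
# 41st-60th procedures for the 50th-procedure milestone) are averaged for a
# given metric.
from statistics import mean

def milestone_average(evaluations, milestone, metric):
    """Average a single ACE metric over the 20 procedures bracketing a milestone."""
    lo, hi = milestone - 9, milestone + 10      # e.g., 41..60 for milestone 50
    scores = []
    for e in evaluations:
        if lo <= e.procedure_number <= hi:
            score = e.motor.get(metric, e.cognitive.get(metric))
            if score is not None:
                scores.append(score)
    return mean(scores) if scores else None

def learning_curve(evaluations, metric, milestones=range(50, 325, 25)):
    """Return {milestone: average score}, the points of one learning curve."""
    return {m: milestone_average(evaluations, m, metric) for m in milestones}
```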
      The thresholds used to define minimal competency in each metric are shown in the bottom rows of Tables 1, 2, and 3. These are the generalizable benchmarks previously developed from a large multicenter trial assessing general gastroenterology fellows using a “contrasting groups” standard-setting method.[2]
      Specifically, average ACE cognitive and motor skill scores of ≥3.5 (on the 1-4 scale), cecal intubation rates of ≥90%, cecal intubation times of ≤15 minutes, polyp detection rates of ≥50%, and polyp miss rates ≤25% are shown. To clarify, polyp detection rates are based on the percentage of cases where the fellow independently identified at least 1 polyp as compared with all cases performed. The polyp miss rate, on the other hand, is the percentage of cases where a fellow missed at least 1 polyp (but was seen by the supervising staff) compared with all cases that had a polyp identified. It should also be noted that these established benchmarks are all minimum performance thresholds that define when a trainee is just crossing over into competence and are used to define the bare minimum expectations of when a trainee is able to operate independently. “Overall competence rate” (Table 3) is defined as the percentage of procedures at a given interval where the thresholds for overall cognitive and overall motor competence were both met. Finally, average scores awarded by each staff group (CRS vs gastroenterology) were compared for the overall cognitive and motor competence scores at each stage of training using a 2-tailed, pooled t test to determine if there were significant differences based on staff type. All statistical analysis was performed using JMP (version 14.1.0) software (SAS Institute Inc, Cary, NC, USA).
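      The rate definitions above can be made concrete with a short sketch. It reuses the illustrative AceEvaluation record from the earlier sketch, and the treatment of per-procedure overall competence as a score of 4 on both overall metrics is an assumption of the sketch, not necessarily the authors' exact computation.

```python
# A minimal sketch of the calculated metrics and benchmarks described above,
# applied to the evaluations falling in one training interval. The threshold
# values are those shown in Tables 1-3; names and structure are illustrative.
def interval_rates(evals):
    n = len(evals)
    cases_with_polyps = [e for e in evals if e.polyps_present]
    return {
        # % of all cases in which the fellow independently reached the cecum
        "cecal_intubation_rate": 100 * sum(e.reached_cecum_independently for e in evals) / n,
        # % of all cases in which the fellow independently identified >=1 polyp
        "polyp_detection_rate": 100 * sum(e.fellow_found_a_polyp for e in evals) / n,
        # % of polyp-containing cases in which the fellow missed >=1 polyp seen by staff
        "polyp_miss_rate": 100 * sum(e.fellow_missed_a_polyp for e in cases_with_polyps)
                           / len(cases_with_polyps) if cases_with_polyps else None,
        # % of cases in which both overall metrics were scored 4 (assumed definition)
        "overall_competence_rate": 100 * sum(
            e.motor.get("overall_hands_on") == 4 and e.cognitive.get("overall_cognitive") == 4
            for e in evals) / n,
    }

# Minimum benchmarks adopted in this study
BENCHMARKS = {
    "ace_skill_average": 3.5,        # each cognitive and motor skill, 1-4 scale, at least
    "cecal_intubation_rate": 90,     # percent, at least
    "cecal_intubation_minutes": 15,  # at most
    "polyp_detection_rate": 50,      # percent, at least
    "polyp_miss_rate": 25,           # percent, at most
}
```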

      Results

      Subjects

      Over the 4-year study period, there were 13 CRS trainees (9 men, 4 women). One individual was excluded from this analysis because his experience of over 300 colonoscopies before fellowship placed all of his evaluations outside the examined assessment intervals. The remaining 12 fellows (8 men, 4 women) began fellowship having performed an average of 123 colonoscopies (range, 50-266) during their general surgery residency. This prior experience was accounted for when determining the intervals at which their subsequent procedures were graded and analyzed. For example, if a fellow had completed 63 colonoscopies before fellowship, his or her first colonoscopy graded with the ACE tool during fellowship was counted as the 64th procedure of overall experience for this analysis. During the study period, these fellows were recorded to have performed 1631 screening colonoscopies (average, 136 colonoscopies per fellow [range, 116-173]), which form the basis of this analysis. ACE evaluations were completed on 1428 procedures (88%). Of these completed evaluations, 13 different CRS staff completed 295 (21%), whereas 52 different gastroenterology staff completed 1133 (79%).
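      The offset rule in the example above amounts to the following one-line helper, included only as an illustrative restatement (the function name is hypothetical, not part of the study's code).

```python
# A small illustration (assumed helper, not the authors' code) of how prior
# residency experience offsets the procedure numbering used in this analysis.
def overall_procedure_number(prior_cases: int, fellowship_case_index: int) -> int:
    """fellowship_case_index is 1 for the first ACE-graded fellowship case."""
    return prior_cases + fellowship_case_index

# A fellow entering with 63 prior colonoscopies: the first graded fellowship
# case counts as the 64th procedure of overall experience.
assert overall_procedure_number(63, 1) == 64
```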

      Performance

      The average scores for each of the 7 motor ACE metrics at each stage of training are shown (Table 1). Similarly, the 7 cognitive ACE metrics are shown over the same intervals (Table 2). Because all fellows entered fellowship having performed at least 50 prior procedures, the tables begin at this interval. The calculated metrics such as cecal intubation times and rates, polyp detection and miss rates, and overall competency rates are shown (Table 3).
      At the onset of this study, none of the trainees achieved any of the established ACE competency benchmarks during their initial evaluations. CRS fellows first surpassed and maintained polyp detection rates above the 50% threshold at 125 procedures of experience, whereas polyp miss rates of ≤25% were achieved at 175 procedures. An average cecal intubation time of ≤15 minutes was consistently achieved at 200 procedures. Independent cecal intubation rates did not reach the threshold of ≥90% until 250 procedures. The individual cognitive ACE skill averages reached the minimum competency threshold scores (≥3.5) at around 275 procedures, whereas the motor skill goals were generally reached at around 300 procedures. Two exceptions were “pain management” and “ability to apply the tool,” which were both met at 325 procedures. These 2 data points are not shown because, with most trainees not reaching this stage of training, only 6 evaluations were available at the 325-procedure interval.
      A comparison of the average overall cognitive and motor scores awarded by CRS staff with those awarded by gastroenterology staff showed that the CRS staff generally awarded slightly higher scores (Table 4). In particular, the average overall cognitive competence scores reached the competency benchmark as early as 125 procedures by CRS staff grading but not until 275 procedures by gastroenterology staff grading. Although notable, the differences in average scores were not statistically significant for most of the assessment intervals.
      Table 4. Comparison of overall scores by staff type
      No. of procedures of experience | No. of cases (CRS/gastroenterology) | Overall hands-on skills, CRS | Overall hands-on skills, gastroenterology | P value | Overall cognitive skills, CRS | Overall cognitive skills, gastroenterology | P value
      50 | 0/12 | N/A | 2.3 (2.0-2.6) | N/A | N/A | 3.0 (2.5-3.5) | N/A
      75 | 5/73 | 3.2 (2.5-3.9) | 2.5 (2.4-2.7) | NS | 3.0 (2.3-3.7) | 2.8 (2.7-3.0) | NS
      100 | 13/74 | 2.3 (1.9-2.7) | 2.9 (2.7-3.0) | <.05* | 2.5 (2.1-2.9) | 3.0 (2.9-3.2) | <.05*
      125 | 9/93 | 3.7 (3.3-4.0) | 2.8 (2.7-3.0) | <.05 | 3.7 (3.2-4.0) | 3.0 (2.8-3.1) | <.05
      150 | 25/106 | 3.1 (2.8-3.4) | 2.8 (2.7-3.0) | NS | 3.1 (2.8-3.4) | 3.0 (2.8-3.1) | NS
      175 | 30/126 | 3.2 (3.0-3.5) | 3.2 (3.0-3.3) | NS | 3.5 (3.3-3.7) | 3.4 (3.2-3.5) | NS
      200 | 33/99 | 3.4 (3.2-3.7) | 2.9 (2.7-3.0) | <.05 | 3.5 (3.2-3.7) | 3.1 (3.0-3.2) | <.05
      225 | 13/63 | 3.5 (3.0-3.9) | 3.0 (2.9-3.2) | NS | 3.5 (3.1-3.9) | 3.2 (3.0-3.3) | NS
      250 | 13/66 | 3.3 (3.0-3.6) | 3.2 (3.1-3.4) | NS | 3.4 (3.1-3.7) | 3.3 (3.1-3.4) | NS
      275 | 12/49 | 3.3 (2.9-3.8) | 3.4 (3.2-3.6) | NS | 3.4 (3.1-3.7) | 3.7 (3.6-3.8) | NS
      300 | 2/26 | 3.5 (2.3-4.0) | 3.5 (3.1-3.8) | NS | 4.0 (3.4-4.0) | 3.7 (3.5-3.9) | NS
      The average scores (95% confidence interval) given by the CRS staff and gastroenterology staff are shown at each stage of training for the overall hands-on and overall cognitive skills. Average scores of ≥3.5 meet the minimal competency threshold. In general, the CRS staff tended to award higher scores, but a statistically significant difference with higher CRS scores was seen at only 2 stages (125 and 200 procedures; P < .05).
      CRS, Colorectal surgery; NS, not significant; N/A, not applicable.
      *Gastroenterology staff awarded statistically significantly higher average scores than the CRS staff (at the 100-procedure interval for both metrics).

      Discussion

      Screening colonoscopy is widely used in the prevention of colon cancer, but its success in preventing colon cancer depends on the quality and completeness of the examination.[10]
      Concerns have been raised that an abbreviated training pathway of only 50 procedures to gain privileges in colonoscopy may impact dysplasia detection in practice and lead to an increase in colon cancer rates.[11]
      The counterargument is that surgical trainees uniformly achieve competence in colonoscopy faster than gastroenterology fellows because they exercise hand-eye coordination skills as part of their everyday practice, and that this is borne out in the validation research of the FES program. The purpose of this study was to define the actual learning curves of CRS trainees in colonoscopy and, in so doing, determine whether surgical trainees can indeed achieve competence in an accelerated manner.
      Current recommendations for gastroenterology trainees suggest that a minimum of 250 patient-based colonoscopies should be performed before competency assessment is attempted.[3,4,12]
      This is an increase from past recommendations of 140 procedures and is based on learning curves and competency thresholds defined in studies using the ACE assessment tool.[1,2,9]
      In contrast, the Society of American Gastrointestinal and Endoscopic Surgeons developed an alternate training pathway based on a simulation training/assessment curriculum (FES) to augment a bedside experience of as few as 50 live colonoscopies. Because of the differences in assessment tools and training methods used, it has been impossible until now to determine whether the learning curves of CRS fellows really are accelerated compared with those of their gastroenterology counterparts or whether this is simply a phenomenon of the different assessment tools and bar-setting methods. This study is the first attempt to define the learning curves of surgical trainees in colonoscopy using the ACE tool and to allow a direct comparison with established gastroenterology learning curves.
      What we found was that at the beginning of training, none of the surgical trainees, who had previously completed general surgery residencies and the FES program and had at least 50 colonoscopies of experience, was able to achieve the competency thresholds defined for the ACE tool metrics. (Of note, 1 fellow was an international trainee and had not previously taken part in the FES program.) The learning curves established here also demonstrate that CRS fellows on average achieved the minimal competency thresholds at essentially the same points in training as historical data for gastroenterology trainees on each quality metric examined and that, like their gastroenterology counterparts, they require roughly 275 (±25) procedures on average to meet performance expectations. Of note, although the CRS staff generally awarded slightly higher average grades than gastroenterology staff, specifically in the cognitive skills, the differences in scores were for the most part not statistically significant.
      If we assume that the ACE tool assessments are accurate, as prior validation studies suggest, then what accounts for the difference between these results and prior studies showing surgical trainees to be competent after as few as 50 colonoscopies? One possible factor is the difference in the competency threshold-setting methods between the 2 training pathways. For the ACE tool, competency thresholds for each metric were established using a “contrasting groups” method, in which each procedure’s ACE metrics were recorded and grouped as competent or noncompetent based on an independent assessment of overall cognitive and motor competency for each individual procedure.[2,9]
      In contrast, although the surgical FES program also used a contrasting groups method, it based its division of groups on the a priori assumption that those with >100 combined EGD and colonoscopy procedures of experience were competent and those with <100 were not.[6]
      This appears to be roughly based on the Residency Review Committee for Surgery recommendation of 50 colonoscopies and 35 EGDs during training. Using this as the division between the 2 groups, the FES performance thresholds were established. It should come as no surprise, then, that someone with >100 combined procedures of experience would be able to reliably surpass the FES performance thresholds, because this is how the bar was defined. However, was the a priori assumption correct that 100 procedures of experience is the right division on which to define the score thresholds? The ACE data presented here, and the fact that CRS fellows do not appear to achieve competence until 275 to 300 procedures using this alternate assessment method, suggest that 100 procedures is not enough and that the FES bar was perhaps set too low during standard setting.
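      To make the comparison of the 2 standard-setting approaches more tangible, the sketch below shows one common way a contrasting-groups cut point can be computed. The scores, candidate cutoffs, and the misclassification rule are illustrative assumptions only, not the method reported in the cited validation studies.

```python
# A minimal sketch of a "contrasting groups" standard-setting calculation.
# Procedures are divided into competent and noncompetent groups (by an
# independent overall judgment for ACE, or by the >100-procedure assumption
# for FES), and the passing score is placed where the two score distributions
# separate; here that is approximated as the cut point minimizing total
# misclassification.
def contrasting_groups_cutoff(competent_scores, noncompetent_scores, candidate_cuts):
    def misclassified(cut):
        false_fail = sum(s < cut for s in competent_scores)      # competent scored below cut
        false_pass = sum(s >= cut for s in noncompetent_scores)  # noncompetent scored at/above cut
        return false_fail + false_pass
    return min(candidate_cuts, key=misclassified)

# Hypothetical example on the 1-4 ACE scale, testing cutoffs in 0.5 steps
cutoff = contrasting_groups_cutoff(
    competent_scores=[3.5, 4.0, 3.5, 4.0, 3.5],
    noncompetent_scores=[2.0, 2.5, 3.0, 3.0, 3.5],
    candidate_cuts=[2.0, 2.5, 3.0, 3.5, 4.0],
)
print(cutoff)  # 3.5 for these made-up scores
```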
      Admittedly, potential biases must be acknowledged. During the study, the procedure numbers recorded by our database differed from the numbers self-reported by the trainees. Although we report an average of 136 procedures per individual, self-reported procedure logs document an average of 149 procedures, with all fellows reported to have completed the 140 procedures required by national standards. Repeat analysis of the database confirmed the lower number of 136 and suggests that data entry at the time of the procedure may have failed to reliably record the fellow as a participant in the case, leaving those procedures out of our database record for analysis. Although a future prospective analysis can ensure this is addressed, the discrepancy raises the possibility that reaching the competency benchmarks may actually take CRS trainees slightly longer than 275 to 300 procedures, because these missing cases would shift scored procedures to a later stage of training. Again, a prospective study monitoring the agreement between the database and personal logs is recommended. In addition, staff in this study were not blinded to the type of trainee (gastroenterology vs CRS) or to the amount of prior experience possessed by trainees, which may have introduced grading bias. The heavily disproportionate number of gastroenterology staff evaluations relative to those of CRS staff must also be considered a potential bias. Finally, the data presented here represent only a small group of CRS fellows at a single program. A larger multicenter blinded prospective study directly comparing gastroenterology and surgical trainees would be ideal to determine not only whether the learning curves presented here are supported but also whether there is any measurable difference in the speed at which 1 group achieves competency over the other.
      In summary, the colonoscopy learning curves of surgical trainees appear to be essentially identical to those of gastroenterology fellows, suggesting that surgical trainees also require on average 275 to 300 colonoscopies to meet the ACE metrics defining minimal competence. Additionally, performance expectations of CRS staff appear to differ slightly, but not significantly, from those of their gastroenterology counterparts when using the ACE tool. Further research is needed to determine whether these findings are supported.

      Appendix 1

      American Society for Gastrointestinal Endoscopy Assessment of Competency in Endoscopy (ACE) colonoscopy skills assessment tool

      Fellow:
      Staff:
      Date of procedure:
      Time of Intubation:
      Time of Maximal Insertion Extent:
      Time of Extubation:
      • 1.
        Fellow’s knowledge of the indication & pertinent medical issues (INR, Vitals, Allergies, PMH etc):
        • N/A. Not Assessed (ie, Fellow observed procedure only)
        • 1. Novice (Poor knowledge of patient’s issue, or started sedating without knowing the indication)
        • 2. Intermediate (Missed an Important element, ie, Allergies, Gastroenterology Surgical History or INR in pt on Coumadin)
        • 3. Advanced (Missed minor elements)
        • 4. Superior (Appropriate knowledge and integration of patient information)
      • 2.
        Management of patient discomfort during this procedure (Sedation Titration, Insufflation management, Loop reduction):
        • N/A Fellow observed
        • 1. Novice (Does not quickly recognize patient discomfort or requires repeated staff prompting to act)
        • 2. Intermediate (Recognizes pain but does not address cause [loop or sedation problems] in a timely manner)
        • 3. Advanced (Adequate recognition and corrective measures)
        • 4. Superior (Competent continuous assessment & management, ie, intermittently reassess level of sedation and comfort)
      • 3.
        Effective and efficient use of air, water and suction:
        • N/A. Not Assessed (ie, Fellow observed procedure only)
        • 1. Novice (Repeated prompting due to too much/little air, Inadequate washing or repeated suctioning of mucosa)
        • 2. Intermediate (Occasional Prompting due to too much/little air, Inadequate washing or repeated suctioning of mucosa)
        • 3. Advanced (Adequate use of air, water and suctioning, but room to improve on efficiency)
        • 4. Superior (Efficient and effective management of washing, suctioning and air)
      • 4.
        Lumen identification:
        • N/A. Not Assessed (ie, Fellow observed procedure only)
        • 1. Novice (Generally only able to recognize lumen if in direct view)
        • 2. Intermediate (Can grossly interpret large folds to help locate which direction the lumen is located)
        • 3. Advanced (Can use more subtle clues (Light/shadows, arcs of fine circular muscles in wall) but struggles at times)
        • 4. Superior (Quickly and reliably recognizes where lumen should be based on even subtle clues)
      • 5.
        Scope steering technique during advancement:
        • N/A. Not Assessed (ie, Fellow observed procedure only)
        • 1. Novice (Primarily “Two-hand knob steering”, Unable to perform two steering maneuvers simultaneously)
        • 2. Intermediate (Frequent 2-hand knob steering, Limited use of simultaneous steering maneuvers [ie, torque, knob, advance])
        • 3. Advanced (Primarily uses torque steering. Can perform simultaneous steering techniques)
        • 4. Superior (Effortlessly combines simultaneous steering techniques [torque, knob, advance] to navigate even many difficult turns)
      • 6.
        Fine tip control:
        • N/A. Not Assessed (ie, Fellow observed procedure only)
        • 1. Novice (Primarily gross tip control only, frequently in red out)
        • 2. Intermediate (Limited fine tip control. “frequently over-steers turns, struggles with biopsy forceps/snare targeting”)
        • 3. Advanced (loses fine control when keeping lumen or targeting tools at difficult turns when torque or knobs are needed)
        • 4. Superior (Excellent fine tip control or tool targeting even in difficult situation.)
      • 7.
        Loop reduction techniques (pull-back, external pressure, patient position change):
        • N/A. Not Assessed (ie, Fellow observed procedure only)
        • 1. Novice (Unable to reduce/avoid loops without hands-on assistance)
        • 2. Intermediate (Needs considerable coaching on when or how to perform loop reduction maneuvers)
        • 3. Advanced (Able to reduce/avoid loops with limited coaching)
        • 4. Superior (without coaching, uses appropriate ext. pressure/position changes/loop reduction techniques)
      • 8.
        What is the farthest landmark the fellow reached without any hands-on assistance?
        • N/A. fellow observed only or Procedure terminated before completion.
        • 1. Rectum
        • 2. Sigmoid
        • 3. Splenic flexure
        • 4. Hepatic flexure
        • 5. Cecum, no TI attempt (Reached cecum with no attempt at TI intubation)
        • 6. Cecum, failed TI attempt (Reached cecum but failed attempt at TI intubation)
        • 7. Terminal ileum (Successful intubation of TI)
        • 8. Other: postsurgical anatomy encountered, fellow reached maximal intubation
      • 9.
        Adequately visualized mucosa during withdrawal
        • N/A. Not Assessed (ie, Fellow observed procedure only)
        • 1. Novice (red out much of the time, does not visualize significant portions of the mucosa or requires assistance)
        • 2. Intermediate (Able to Visualize much of the mucosa but requires direction to re-inspect missed areas)
        • 3. Advanced (Able to adequately visualize most of the mucosa without coaching)
        • 4. Superior (Good visualization around difficult corners and folds and good use of suction/cleaning techniques.)
      • 10.
        Pathology identification/interpretation:
        • N/A, Study was normal (Go to question 11)
        • 1. Novice (Poor recognition of abnormalities. Misses or cannot ID significant pathology)
        • 2. Intermediate (Recognize abnormal findings but cannot interpret “erythema”)
        • 3. Advanced (Recognizes abnormalities and correctly interprets “colitis”)
        • 4. Superior (Competent Identification and assessment. “Mild chronic appearing colitis in a pattern suggestive of UC”)
      • 10a.
        Independent polyp detection by fellow
        • N/A. No Polyps present
        • 1. None (Staff identified all polyps)
        • 2. Some (Fellow independently identified at least one polyp but not all polyps present)
        • 3. All (Fellow independently ID’ed all polyps encountered)
      • 10b.
        Accurate location of lesion/pathology:
        • 1. Novice (Unable to use landmarks to ID location in the colon, “ I don’t know”)
        • 2. Intermediate (Understands landmarks but either does not recognize or incorporate into decision making process)
        • 3. Advanced (Good understanding and recognition of landmarks but generalizes pathology location “Descending colon”)
        • 4. Superior (Very Specific about location, e.g.“Splenic Flexure region approx. 60 cm from the anal verge with a straight scope”)
      • 11.
        Interventions performed by fellow:
        • CHECK ALL THAT APPLY
        • N/A – Fellow did not perform any interventions (go to question 12)
        • Biopsy
        • Snare polypectomy
        • Submucosal injection (Lift, Epinephrine, Tattoo)
        • APC Vascular lesion ablation (AVMs)
        • Hemostasis (Hemoclip, electrocautery, etc)
        • Other____________________________
      • 11a.
        What was the fellow’s participation in the therapeutic maneuver(s) (ability to apply tool effectively)?
        • N/A. Not Assessed (ie, Fellow observed procedure only)
        • 1. Novice (Performed with significant hands-on assistance or coaching)
        • 2. Intermediate (Performed with minor hands-on assistance or significant coaching)
        • 3. Advanced (Performed Independently with minor coaching)
        • 4. Superior (Performed independently without coaching)
      • 11b.
        What was the fellow's knowledge of the therapeutic tool(s) (tool selection, knowledge of set up, cautery setting, how to employ tool)?
        • N/A. Not Assessed (ie, Fellow observed procedure only)
        • 1. Novice (Unsure of the possible tool(s) indicated or settings for pathology encountered.)
        • 2. Intermediate (Able to identify possible appropriate tool choices but not sure which would be ideal [Snare vs lift & snare])
        • 3. Advanced (Independently selects the correct tool yet needs coaching on settings)
        • 4. Superior (Independently identifies correct tool and settings as applicable.)
      Overall Assessment:
      • 12.
        The fellow’s overall hands-on skills:
        • N/A. Not Assessed (ie, Fellow observed procedure only)
        • 1. Novice (Learning basic scope advancement; requires significant assistance and coaching)
        • 2. Intermediate (Acquired basic motor skills but still requires limited hands-on assistance and/or significant coaching)
        • 3. Advanced (Able to perform independently with limited coaching and/or requires additional time to complete)
        • 4. Superior (Competent to perform routine colonoscopy independently)
      • 13.
        The fellow’s overall cognitive skills (Situational Awareness (SA)/abnormality interpretation/decision making skills):
        • N/A. Not Assessed (ie, Fellow observed procedure only)
        • 1. Novice (Needs significant prompting, correction or basic instruction by staff)
        • 2. Intermediate (Needs intermittent coaching or correction by staff)
        • 3. Advanced (Fellow has good SA, and interpretation/decision making skills)
        • 4. Superior (Competent to make interpretations and treatment decisions independently)
      INR, International normalized ratio; PMH, past medical history; TI, terminal ileum; UC, ulcerative colitis; ID, identified/identification; APC, argon plasma coagulation; AVM, arteriovenous malformation; SA, situational awareness.
      Modified from the Mayo Colonoscopy Skills Assessment Tool (© Mayo Foundation for Medical Education and Research) as reported in Sedlack RE. The Mayo Colonoscopy Skills Assessment Tool: a validation of a unique instrument to assess colonoscopy skills in trainees. Gastrointest Endosc 2010;72:1125-33. Used with permission.

      References

        1. ASGE Training Committee; Sedlack RE, Coyle WJ, Obstein KL, et al. ASGE's assessment of competency in endoscopy evaluation tools for colonoscopy and EGD. Gastrointest Endosc 2014;79:1-7.
        2. Sedlack RE, Coyle WJ; ACE Research Group. Assessment of competency in endoscopy: establishing and validating generalizable competency benchmarks for colonoscopy. Gastrointest Endosc 2016;83:524-526.
        3. ASGE Training Committee; Adler DG, Bakis G, Coyle WJ, et al. Principles of training in GI endoscopy. Gastrointest Endosc 2012;75:231-235.
        4. ASGE Training Committee; Walsh CM, Umar SB, Ghassemi S, et al. Colonoscopy core curriculum. Gastrointest Endosc 2020;93:297-304.
        5. Vassiliou MC, Kaneva PA, Poulose BK, et al. Global Assessment of Gastrointestinal Endoscopic Skills (GAGES): a valid measurement tool for technical skills in flexible endoscopy. Surg Endosc 2010;24:1834-1841.
        6. Vassiliou MC, Dunkin BJ, Fried GM, et al. Fundamentals of endoscopic surgery: creation and validation of the hands-on test. Surg Endosc 2014;28:704-711.
        7. Vassiliou MC, Kaneva PA, Poulose BK, et al. How should we establish the clinical case numbers required to achieve proficiency in flexible endoscopy? Am J Surg 2010;199:121-125.
        8. Kane SV, Chandrasekhara V, Sedlack RE, et al. Credentialing for endoscopic practice: the Mayo Clinic model. Clin Gastroenterol Hepatol 2018;16:1370-1373.
        9. Sedlack RE. The Mayo Colonoscopy Skills Assessment Tool: validation of a unique instrument to assess colonoscopy skills in trainees. Gastrointest Endosc 2010;72:1125-1133.
        10. Rex DK, Schoenfeld PS, Cohen J, et al. Quality indicators for colonoscopy. Gastrointest Endosc 2015;81:31-53.
        11. Rabeneck L, Paszat LF, Saskin R. Endoscopist specialty is associated with incident colorectal cancer after a negative colonoscopy. Clin Gastroenterol Hepatol 2010;8:275-279.
        12. Shahidi N, Ou G, Telford J, et al. Establishing the learning curve for achieving competency in performing colonoscopy: a systematic review. Gastrointest Endosc 2014;80:410-416.
