Evaluation of laparoscopic skills: a 2-year follow-up during residency training
===============================================================================

* Anna M. Derossis
* Maureen Antoniuk
* Gerald M. Fried

## Abstract

**Objective:** To evaluate laparoscopic technical skill in surgical residents over a 2-year period.

**Design:** The laparoscopic technical skills of general surgical residents were evaluated using the MISTELS program, which provides an objective evaluation of laparoscopic skill, taking into account precision and speed.

**Setting:** Inanimate laparoscopic skills centre.

**Participants:** Ten general surgical residents (5 PGY1, 3 PGY2 and 2 PGY3 residents) who were required to complete 3 structured laparoscopic tasks.

**Outcome measures:** A composite score incorporating precision and timing was assigned to each task. The paired *t*-test was used to compare the performance of each resident at the 2 levels of residency training for each task. Linear regression analysis was used to correlate level of training with total score (sum of all tasks).

**Results:** Linear regression analysis demonstrated a highly significant correlation between level of training and total score (*r* = 0.82, *p* < 0.01). Scores on the cutting and suturing tasks increased significantly over the 2-year period (*p* < 0.01). Transferring skills did not improve significantly (*p* = 0.11).

**Conclusions:** Performance in the simulator improved over residency training and correlated highly with postgraduate year. This simulator model is a valuable teaching tool for training and evaluation of basic laparoscopic tasks in laparoscopic surgery.

The acquisition of surgical technical skill is an important component of residency training, and assessment of performance plays a central role in the evaluation process. To date, the evaluation of technical skill in most academic institutions has been relegated to a few items on a checklist assessment form.
Most surgical educators are convinced that more credible, reliable and valid indicators must be used to evaluate their residents' technical skills. The performance of laparoscopy requires certain basic skills inherent in all videoendoscopic surgery, including ambidexterity, eye–hand coordination and depth perception. Performing a laparoscopic operation further requires the ability to use laparoscopic instrumentation and to perform certain operative skills (e.g., dissection, cutting, placement of an intracorporeal suture).

The purpose of this study was to evaluate technical skill in laparoscopic surgery at a 2-year interval in residency training. Laparoscopic tasks were developed to assess skill objectively and quantitatively by measuring performance in a surgical simulator.

## Participants and methods

The study involved 10 residents: 5 PGY1 residents, 3 PGY2 residents and 2 PGY3 residents were evaluated initially and then again 2 years later. Three laparoscopic tasks were evaluated at each session within a surgical simulator. An introductory video demonstrating proper performance of the exercises was shown to each candidate before testing.

The simulator consists of a laparoscopic trainer box measuring 40 × 30 × 19.5 cm (USSC Laptrainer, United States Surgical Corp., Norwalk, Conn.) covered by an opaque membrane. Two 12-mm trocars (USSC Surgiport, United States Surgical Corp.) were placed through the membrane at convenient working angles on either side of the 10-mm 0° laparoscope (USSC Surgiview, United States Surgical Corp.). Four alligator clips within the simulator were used to suspend materials for certain exercises. The laparoscope and camera were mounted on a stand at a fixed focal length, enabling the examinee to work independently. The optical system consisted of the laparoscope, camera, light source and video monitor; the monitor was placed in line with the operator.
Three standardized exercises were used, based on a prior McGill laparoscopic simulator study of 7 tasks.1 In that study, each task was analysed individually by linear regression analysis, and 4 of the 7 tasks showed a significant correlation between scores and level of residency training. Three of these tasks were used in the present study (pegboard transfer, pattern cutting and intracorporeal suturing).

Performance of each task was scored for both precision and speed. For each exercise, a timing component was calculated by subtracting the time to complete the exercise from a preset cutoff time (timing component = cutoff time [seconds] − time to complete the exercise [seconds]). This system rewards faster performance with higher scores. If the time to complete the exercise exceeded the preset cutoff, a timing component of zero was given; no negative values were assigned. Precision was scored by calculating a penalty component for each exercise (see the descriptions of the exercises). The score for each exercise was then calculated by subtracting the penalty from the timing component; thus, the more accurately and quickly a task was completed, the higher the score. A total score was calculated as the sum of the scores for the 3 exercises.

### Task 1 — pegboard

Using 2 pegboards and 6 pegs, the operator was required to lift each peg from one pegboard with the left hand, transfer it to the right hand and place it on the other pegboard; the sequence was then reversed. The aim was to test eye–hand coordination, depth perception and ambidexterity. The cutoff time was set at 300 s, and the penalty was calculated as the percentage of pegs that could not be transferred as a result of being dropped outside the field of view.

### Task 2 — pattern cutting

This task involved cutting a 4-cm diameter premarked circular pattern out of a 10 × 10-cm piece of gauze suspended between alligator clips.
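The composite scoring rule described above — a timing component floored at zero, minus a task-specific penalty, summed over the 3 exercises — can be sketched as follows. This is a minimal illustration of the arithmetic only; the function name and the example numbers are illustrative, not study data.

```python
def task_score(cutoff_s, time_s, penalty):
    """Composite score for one exercise: timing component minus penalty.

    The timing component rewards faster performance but is floored at
    zero when the cutoff time is exceeded; no negative timing values
    are assigned. The precision penalty is then subtracted.
    """
    timing = max(cutoff_s - time_s, 0.0)
    return timing - penalty

# Illustrative numbers (not study data): a pegboard run finished in
# 210 s against the 300-s cutoff with a 10-point penalty scores 80;
# a cutting run that exceeds the cutoff keeps only its penalty.
pegboard = task_score(cutoff_s=300, time_s=210, penalty=10)  # 90 - 10 = 80
cutting = task_score(cutoff_s=300, time_s=320, penalty=5)    # 0 - 5 = -5
total = pegboard + cutting  # total score = sum over all exercises
```

Note that although the timing component is never negative, a large precision penalty can still drive an individual exercise score below zero.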
The aim was to use the grasper in one hand to place the material under tension while cutting with the endoscopic scissors in the other hand. This task was designed to test skill in cutting with laparoscopic scissors. The cutoff time was 300 s, and the penalty was determined by calculating the percentage area of deviation from a perfect circle.

### Task 3 — intracorporeal suturing

This task involved the placement of a simple suture, 13 cm long, through premarked points in a longitudinally slit Penrose drain. The suture was then tied using an intracorporeal knot technique. This exercise evaluated needle transfer, suture placement and knot tying. The cutoff time was 600 s, and the penalty was calculated to reflect the accuracy and security of the suture. The penalty was the sum of the distance in millimetres between the premarked points and the placed suture and the gap in millimetres if the suture failed to approximate the slit in the Penrose drain. In addition, the security of the knot was assigned 0 penalty points for a secure knot, 10 points for a slipping knot and 20 points for a knot that came apart.

### Data analysis

Each resident served as his or her own control. Data were analysed by a paired *t*-test to compare the performance of each resident at the 2 levels of residency training for each exercise. A probability value of less than 0.05 was considered significant. Linear regression analysis was used to correlate level of training (postgraduate year when the testing was done) with the total score (sum of all 3 tasks).

## Results

A highly significant improvement was seen in the total score over the 2-year interval (*p* = 0.0002) (Table I). When each task was evaluated individually, 2 of the 3 tasks — cutting and intracorporeal suturing — also showed a highly significant improvement. Linear regression analysis of the total scores versus postgraduate year showed a significant correlation (*r* = 0.82, *p* = 0.0001) (Fig. 1).

![FIG. 1](https://www.canjsurg.ca/content/cjs/42/4/293/F1.medium.gif)

[FIG. 1](http://canjsurg.ca/content/42/4/293/F1). Linear regression analysis of total scores versus postgraduate year. *y* = 140.06*x* + 190.83, *r* = 0.82, *p* = 0.0001.

[Table I](http://canjsurg.ca/content/42/4/293/T1). Comparison of performance scores (means [and standard errors]) at 2 intervals during residency training.

## Discussion

Training and evaluation of surgical technical skills have been lacking despite their central importance to the surgical curriculum. Skills acquisition has been mostly limited to the operating room, which is not an ideal environment owing to time and cost constraints and medicolegal concerns. In a recent statement on behalf of the American Board of Surgery, Ritchie appealed for improved and broadened opportunities for the graduate education and training of surgeons, a goal currently challenged by the introduction of a business ethos and marketplace mentality into the practice of medicine and surgery.2 The fundamental responsibility to train qualified surgeons is paramount. No substitute exists for apprenticeship training in the operating room; however, the acquisition of technical skill is not unidimensional, and structured teaching allows time for understanding.3 There is no need to abandon conventional methods of training and evaluation; rather, we may modify and expand on them. The Objective Structured Clinical Examination (OSCE), widely used for medical student evaluation, has been implemented as a tool for clinical evaluation in surgery.4,5 Sloan and colleagues,6 in a 38-station OSCE with 56 surgical residents, found the OSCE highly reliable, with performance varying significantly according to level of training. Standardized tests of technical skill for open procedures were developed as the Objective Structured Assessment of Technical Skill (OSATS).
Reznick and colleagues7 demonstrated high reliability and construct validity of their bench model simulations. Our study evaluated laparoscopic skill through standardized exercises, and the scoring system enabled unbiased, objective evaluation with ease. The laparoscopic simulator curriculum is simple to administer, inexpensive and portable, and it requires limited training.

Laparoscopic skill was assessed in a surgical simulator at a 2-year interval during residency training. The residents' mean score improved significantly over the interval, independent of their level of training when first evaluated. When we examined each of the 3 exercises individually, the residents demonstrated significant improvement in the cutting and suturing exercises; improvement in the pegboard exercise did not reach statistical significance (*p* = 0.11). Perhaps this reflects that eye–hand coordination already acquired does not necessarily improve significantly over 2 years, or that the number of trainees evaluated was inadequate. Construct validity was demonstrated by the improvement in total score as residents advanced in training: linear regression analysis showed a significant correlation between level of training and total score.

## Conclusions

Objective, structured criteria for evaluation provide reliable, increasingly accurate feedback and a benchmark against which progress can be compared. A laparoscopic skills evaluation such as this can serve as an adjunct to the present in-training evaluation of technical skill.

## Acknowledgments

This work was supported by an educational grant from United States Surgical Corporation (Auto Suture Canada), an equipment grant from Storz Endoscopy, Canada, and a grant from the Steinberg–Bernstein Foundation for Video-endoscopic Surgery.
## Footnotes

* This work was presented at the annual meeting of the Society of American Gastrointestinal Endoscopic Surgeons, April 1998.
* Accepted January 4, 1999.

## References

1. Derossis AM, Fried GM, Abrahamowicz M, Sigman HH, Barkun JS, Meakins JL. Development of a model for training and evaluation of laparoscopic skills. Am J Surg 1998;175(6):482–7.
2. Ritchie WP Jr. Graduate surgical education in the era of managed care: a statement from the American Board of Surgery. J Am Coll Surg 1997;184(3):311–2.
3. Reznick RK. Teaching and testing technical skills. Am J Surg 1993;165(3):358–61.
4. Sloan DA, Donnelly MB, Johnson SB, Schwartz RW, Strodel WE. Use of an Objective Structured Clinical Examination (OSCE) to measure improvement in clinical competence during the surgical internship. Surgery 1993;114(2):343–51.
5. Cohen R, Reznick RK, Taylor BR, Provan J, Rothman A. Reliability and validity of the objective structured clinical examination in assessing surgical residents. Am J Surg 1990;160(3):302–5.
6. Sloan DA, Donnelly MB, Schwartz RW, Strodel WE. The Objective Structured Clinical Examination. The new gold standard for evaluating postgraduate clinical performance. Ann Surg 1995;222(6):735–42.
7. Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative "bench station" examination. Am J Surg 1997;173(3):226–30.