How well do internal medicine faculty members evaluate the clinical skills of residents?

Ann Intern Med. 1992 Nov 1;117(9):757-65. doi: 10.7326/0003-4819-117-9-757.

Abstract

Objective: To determine the accuracy of faculty evaluations of residents' clinical skills and whether a structured evaluation form and an instructional videotape improve accuracy.

Design: Randomized, controlled trial.

Setting: Twelve university and community teaching hospitals.

Participants: A total of 203 faculty internists.

Interventions: Participants watched a videotape of one of two residents performing a new-patient workup and were assigned to one of three groups: the first group evaluated the resident using an open-ended evaluation form; the second used a structured form that prompted detailed observations; the third used the structured form after watching an instructional videotape demonstrating good evaluation technique.

Main outcome measures: Faculty observations of strengths and weaknesses in the residents' performance were scored. For each participant, an accuracy score was calculated based on the clinical skills of critical importance for a competent history and physical examination; raters were blinded to the participants' hospital, training, subspecialty, and experience as observers.
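The abstract does not state the scoring formula; a minimal sketch consistent with the reported percentages, assuming the score is simply the fraction of predefined critical items a participant recorded, is

\[ \text{accuracy score} = \frac{n_{\text{recorded}}}{n_{\text{critical}}} \times 100\% \]

where \(n_{\text{critical}}\) is the number of critical strengths and weaknesses present in the videotaped workup and \(n_{\text{recorded}}\) is the number of those the participant noted. Under this reading, unprompted participants scored about 30 and structured-form participants 60 or higher.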

Results: When observations were not prompted, participants recorded only 30% of the residents' strengths and weaknesses; accuracy among participants using structured forms increased to 60% or greater. Faculty in university hospitals were more accurate than those in community hospitals, and general internists were more accurate than subspecialists; the structured form improved performance in all groups. However, participants disagreed markedly about the residents' overall clinical competence: Thirty-one percent assessed one resident's clinical skills as unsatisfactory or marginal, whereas 69% assessed them as satisfactory or superior; 48% assessed the other resident's clinical skills as unsatisfactory or marginal, whereas 52% assessed them as satisfactory or superior. Participants also disagreed about the residents' humanistic qualities. The instructional videotape did not improve accuracy.

Conclusions: A structured form improved the accuracy of observations of clinical skills, but faculty still disagreed in their assessments of clinical competence. If program directors are to certify residents' clinical competence, better and more standardized evaluation is needed.

Publication types

  • Clinical Trial
  • Randomized Controlled Trial
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Clinical Competence*
  • Evaluation Studies as Topic
  • Faculty, Medical*
  • Hospitals, Community
  • Hospitals, Teaching
  • Hospitals, University
  • Humans
  • Internship and Residency / standards*
  • Observer Variation
  • Reproducibility of Results
  • Surveys and Questionnaires
  • United States
  • Videotape Recording