Abstract
Background: National meetings such as those of the American Academy of Orthopædic Surgeons (AAOS) and the Canadian Orthopædic Association (COA) are invaluable in the dissemination of new research findings. Given the limits of meeting agendas, investigators who present the same paper at multiple meetings prevent the presentation of other potentially important original research. To determine the incidence of duplicate presentation of research between recent COA and AAOS meetings and between national meetings (AAOS and subspecialty), we conducted an observational study.
Methods: We hand-searched all podium papers and posters from the 2001 COA annual meeting for duplicate presentation at the 2001 and 2002 AAOS annual meetings and subspecialty meetings held in the USA. We evaluated summary data abstracted from the duplicate presentations for consistency.
Results: Of 148 presentations at the 2001 COA meeting, 29 presentations (paper and poster) were duplicated at the 2001 or 2002 AAOS meeting: effectively 1 paper in 5 (19.5%). Canadian investigators were significantly more likely to present the same paper at both meetings than Americans (79% v. 13%, respectively; p < 0.01). Those who presented papers at COA altered their AAOS presentations in a variety of ways: by changing the wording in the title of their paper (24% of the time), adding or removing authors (38%), changing authorship order (34%) and changing the sample size (31%). Duplicate presentation rates between AAOS and other orthopedic subspecialty meetings averaged 11.4% (range 3.4%–26.4%).
Conclusions: We identified a 20% duplicate presentation rate between the COA and AAOS annual meetings, and an 11% rate between the AAOS and subspecialty meetings. Stricter enforcement of guidelines and improved dissemination of research findings at both national meetings may limit this practice.
National meetings are invaluable in facilitating the dissemination of new research findings to a wide audience. The annual meetings of the American Academy of Orthopædic Surgeons (AAOS) and the Canadian Orthopædic Association (COA) are 2 nationally prominent meetings that serve as platforms for the timely presentation of current research findings. AAOS typically receives over 3200 abstracts, and COA, over 375 for a given meeting. Given their respective agendas, time for presentation of original research is limited, and only a small proportion of submitted research papers (27% at AAOS and 40% at COA) can ultimately be selected for either podium or poster presentation.
Clearly, the goal of most researchers is to publish the results of their investigation in the highest-impact peer-reviewed journals; however, the peer review, editorial and publishing process at these leading journals results in a lag of up to 2 years between completion of the study and circulation of the printed version in a journal. Nationally prominent meetings such as AAOS and COA provide an opportunity to communicate important research findings to the orthopedic community in a timely manner, receive valuable feedback from peers and enable the submission of a higher-quality manuscript to a peer-reviewed journal.
Given the limited opportunity for investigators to present their work, presenting the same paper at more than 1 meeting has several potential disadvantages: it prevents the dissemination of other potentially important studies, denies other investigators the same opportunity to receive valuable feedback for their current and future studies, and risks providing a program in which attendees have already seen the papers and posters, which are no longer new information.
Advocates of duplicate presentation may believe this practice is necessary to ensure a wider audience than would be available at either meeting alone. This argument is based on a belief that audiences at AAOS and COA meetings are different. However, this assumption may not hold between AAOS and subspecialty orthopedic meetings held in the USA.
The extent of duplicate presentation of podium papers and posters between AAOS and COA remains unknown. Additionally, rates of duplicate presentation between AAOS and subspecialty meetings remain largely unreported. Therefore, we conducted an observational study to determine the incidence of duplicate presentation of research between recent meetings of COA and AAOS. We also aimed to determine whether rates of duplicate presentation were similar between AAOS and subspecialty orthopedic meetings within the same country.
Methods
Eligibility criteria
We included all podium papers and posters presented at the 56th Annual Meeting of the Canadian Orthopædic Association in London, Ont., June 1–4, 2001 (to which we refer in general as COA). COA served as the reference meeting. Searches for duplicate presentations focused upon the 68th Annual Meeting of the American Academy of Orthopædic Surgeons, San Francisco, Calif., Feb. 28–March 4, 2001, and the 69th Annual Meeting in Dallas, Tex., Feb. 13–17, 2002. We assumed that any duplicate presentation from COA at AAOS would have occurred in the same or immediately following year (Fig. 1).
The 2001 AAOS meeting also served as the reference meeting to identify duplicate presentations at annual meetings (2000 and 2001) of 3 subspecialty societies: the Orthopædic Trauma Association (OTA), American Association of Hip and Knee Surgeons (AAHKS) and Arthroscopy Association of North America (AANA).
Identification of duplicate podium or poster presentations
Two of us hand-searched the 2001 COA program to identify all eligible research papers presented as podium papers or posters. Because (we reasoned) at least 1 (co)author would be associated with a duplicate presentation, we cross-referenced every COA author with the 2001 (288 podium, 481 posters) and 2002 (320 podium, 545 posters) AAOS meeting programs to identify any potential duplicates. Only those presentations based on the same paper were included for further consideration. All potential duplicate presentations (i.e., same author and title, similar title, similar topic) were retrieved and reviewed in detail by 3 of us for a final determination of eligibility.
We also searched the PubMed database to ascertain whether any of the duplicate presentations had subsequently been published, cross-checking each author's name against title and abstract to detect subsequent publications.
We used the same approach to identify duplicate presentations between the 2001 AAOS meeting and 3 subspecialty society meetings in 2000 and 2001: OTA (187 podium, 134 posters), AAHKS (109 podium, 88 posters) and AANA (137 podium, 125 posters).
Data collection from duplicate presentations
Characteristics of study, as reported in the study abstract
From each abstract, 2 of us abstracted the number of authors, geographic research location, number of centres, orthopedic subspecialty, study design (randomized trial, observational or basic science study), sample size, funding disclosures and direction of results (positive or negative outcome).
Consistency in reporting between presentations
Two of us abstracted information relating to consistency between duplicate presentations, including differences in the study title, number and order of authors, presentation of all outcomes, discrepancy in the study objective/hypothesis, study design, primary outcome measure, sample size and study results. Any difference in study results between presentations at both meetings constituted an inconsistency.
Agreement among reviewers in abstracting data
To ensure consistency in identification, application of eligibility criteria and data abstraction, we conducted assessments in duplicate or triplicate.
The κ statistic was used as a measure of interobserver agreement, from +1 (perfect agreement) to −1 (absolute disagreement).1 A κ value of 0 represents an agreement no better than what could occur by chance alone. Because multiple observers were involved in this study, κ was calculated according to the method described by Fleiss1 and interpreted according to the guidelines proposed by Landis and Koch.2 These guidelines suggest that κ values of 0–0.2 represent slight agreement, 0.21–0.40 fair, 0.41–0.60 moderate and 0.61–0.80 substantial agreement; values > 0.80 were considered near-perfect agreement. For each κ value, we also calculated the 95% confidence interval (CI).
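As an illustration of the agreement statistic described above (not the authors' own code; the study's analyses were done in SPSS), Fleiss' multi-rater κ can be computed from a table of category counts per subject. The reviewer counts and categories below are hypothetical.

```python
def fleiss_kappa(counts):
    """Fleiss' multi-rater kappa.

    counts[i][j] = number of raters who assigned subject i to
    category j; every row must sum to the same number of raters m.
    Returns a value in [-1, 1]; 0 means chance-level agreement.
    """
    n_subjects = len(counts)
    m = sum(counts[0])                      # raters per subject
    total = n_subjects * m                  # total ratings given

    # Observed agreement: mean per-subject proportion of agreeing rater pairs.
    p_i = [(sum(n * n for n in row) - m) / (m * (m - 1)) for row in counts]
    p_bar = sum(p_i) / n_subjects

    # Chance agreement from the marginal category proportions.
    k = len(counts[0])
    p_j = [sum(row[j] for row in counts) / total for j in range(k)]
    p_e = sum(p * p for p in p_j)

    return (p_bar - p_e) / (1 - p_e)

# Hypothetical example: 3 reviewers classify 4 abstracts as
# duplicate (column 0) or not duplicate (column 1).
ratings = [[2, 1], [1, 2], [3, 0], [0, 3]]
print(round(fleiss_kappa(ratings), 3))  # → 0.333, "fair" on the Landis–Koch scale
```

When all raters agree on every subject, the function returns exactly 1, matching the interpretation of κ = +1 as perfect agreement.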
All discrepancies were resolved by consensus.
Database development and data entry
We created a database using SPSS statistical software (version 10). Logic checks built into the database further limited errors in data entry; for example, the program flagged numbers other than those specified for a particular question. One coauthor entered all data, and another checked it for accuracy. An independent author did a final visual check, reviewing each database cell for inconsistent or missing data.
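The kind of logic check described above can be sketched as follows. This is a hypothetical illustration in Python (the study used SPSS), and the field names and allowed codes are assumptions based on the variables listed under Methods, not the study's actual codebook.

```python
# Allowed codes for two of the abstracted variables; the field names
# and code sets are illustrative assumptions, not the study's codebook.
ALLOWED = {
    "study_design": {"randomized trial", "observational", "basic science"},
    "direction_of_results": {"positive", "negative"},
}

def flag_invalid(record):
    """Return the fields of a data-entry record whose values fall
    outside the allowed codes, mimicking the database's logic checks."""
    return [field for field, allowed in ALLOWED.items()
            if record.get(field) not in allowed]

print(flag_invalid({"study_design": "observational",
                    "direction_of_results": "mixed"}))  # → ['direction_of_results']
```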
Statistical analysis
Categorical variables (country of research, orthopedic subspecialty, etc.) are presented as proportions with 95% CIs; continuous variables (e.g., sample size), as means or medians with standard deviations. Differences between continuous variables were compared with Student’s t test, and dichotomous variables with a χ2 test. We explored whether differences between AAOS presentations were distributed similarly at the 2001 and 2002 AAOS meetings by using cross-tabulations. A p value of 0.05 or less was considered statistically significant.
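For instance, the χ2 comparison of two proportions reduces to a test on a 2 × 2 contingency table. The sketch below (standard-library Python, with hypothetical counts rather than the study's raw data) uses the fact that for 1 degree of freedom the χ2 survival function equals the complementary error function.

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-square test for the 2x2 table [[a, b], [c, d]]
    (1 degree of freedom, no continuity correction).
    Returns (statistic, p value)."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # With 1 df, P(chi2 >= stat) = erfc(sqrt(stat / 2)).
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Hypothetical counts (NOT the study's raw data): 20 of 30 papers
# duplicated in group A vs. 10 of 30 in group B.
stat, p = chi2_2x2(20, 10, 10, 20)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")
```

With these illustrative counts the statistic is about 6.67 and p is below the 0.05 threshold, so the difference in proportions would be declared significant under the analysis plan above.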
Results
Identification of duplicate presentations
At the 2001 COA annual meeting 148 papers were presented, of which 84 were podium presentations and 64 poster presentations. The initial review of the 2001 and 2002 AAOS programs revealed 34 potentially duplicate presentations. After detailed review of the titles, authors and abstracts, 5 papers were excluded, leaving 29 duplicate presentations for consideration. Agreement among reviewers for identifying duplicate presentations was high (κ = 0.83, CI 0.72–0.94).
The duplicate presentation rate was 19.5% (29/148): effectively 1 in 5 papers presented at COA in 2001 (23 podium, 6 posters) was presented again at AAOS in 2001 (11/29) or 2002 (18/29). No disclaimer indicating previous presentation appeared in the 2001 COA program or in the 2001 or 2002 AAOS programs. With a PubMed search we identified 4 articles (13.8%) published in 4 different journals to date: Journal of Bone and Joint Surgery, American volume and British volume; Journal of Arthroplasty; and Clinical Orthopædics and Related Research. Of the 4 articles, 3 focused upon hip arthroplasty and 1 on methodology.
Our review of duplicate presentations between AAOS and subspecialty meetings revealed rates of 3.4% to 26.4%.
Characteristics of the duplicate presentations
Although the majority of COA presentations (17/29, 59%) were repeated in the same format (poster to poster or podium to podium), more than one-third of COA podium presentations (38%) were shown at AAOS as posters (Table 1). Of the duplicate presentations, 85% focused upon hip or knee surgery and trauma; only 13% were randomized trials.
Canadian investigators were significantly more likely to present the same paper at both meetings (79%) than American investigators (13%; p < 0.01). In addition, 20 of the 29 duplicate papers (69%) occurred in 2 Canadian centres (centre A, 11 papers; centre B, 9 papers).
In our review of 3 major orthopedic subspecialty meetings, duplicate presentations were significantly more common between AAOS and OTA (26.4%) than between AAOS and either AAHKS (3.4%) or AANA meetings (4.4%; p < 0.01; Table 2).
Consistency between duplicate presentations
Study information that was presented at both COA and AAOS (Table 3) was highly consistent in study objectives, design, outcomes and direction of results (Table 4). Investigators who presented papers at COA altered their AAOS presentations by adding or removing authors 38% of the time, changing authorship order 34% of the time, changing the sample size 31% of the time and changing the wording in the title of their paper 24% of the time. For example, 1 paper focusing upon multidirectional instability of the shoulder was presented with 5 authors at COA and 4 authors at AAOS in the same year: despite having the same content, 2 authors were removed and 1 new author was added between COA and AAOS.
We explored whether differences between presentations at AAOS were distributed similarly in the 2001 and 2002 AAOS meetings. Order of authorship was changed in a significantly greater proportion of papers presented at the 2002 AAOS (83%) than at the 2001 AAOS meeting (36%; p = 0.02). Changes in authorship (additions, removals or both) in presentations occurred with similar frequency at both AAOS meetings: 45% in 2001 and 34% in 2002 (p = 0.65).
Discussion and conclusions
In our study, we found that 1 in 5 abstracts was presented in duplicate at COA and AAOS meetings; that 1 in 10 abstracts was presented in duplicate between AAOS and subspecialty meetings; and that frequent changes in AAOS presentations included changes to the study title or authorship.
Although the double data abstraction and assessment of reliability strengthen the inferences that can be made from this study, our findings may not be generalizable to meetings outside of COA, AAOS, OTA, AAHKS and AANA. Our decision to use the annual COA and AAOS meetings was based solely upon their status as the most notable national meetings in Canada and the USA. In addition, we believed that trauma, hip and knee surgery and sports medicine constituted a suitable sample of subspecialties to provide an informed estimate of duplicate presentation rates.
Although several studies have reported publication rates following specialty meetings,3–13 little has been written on duplicate presentation at multiple meetings. Abstract submission guidelines for AAOS and subspecialty meetings alike require researchers to report whether their work has been previously presented or published. We are uncertain how 1 in 5 abstracts were presented at both the COA and AAOS meetings.
Several reasons for duplicate presentation at AAOS remain:
- Investigators neglected to report prior presentation on the abstract submission form.
- Overlap between the abstract submission and acceptance notification periods of the 2 meetings (Fig. 1) did not enable investigators to disclose prior presentation.
- Changes in abstract titles and authors modified the appearance of abstracts and permitted them to pass through review.
- Reviewers may have decided that certain abstracts were worthy of re-presentation.
We contacted the executive officer responsible for annual COA meeting programs to discuss policies about duplicate presentation. Currently, there is no formal policy prohibiting the submission of papers that have been published or presented elsewhere. Neither does COA require authors to disclose prior publication or presentation, although COA does hope that authors will be forthcoming with this information. Acceptance of such papers is currently at the discretion of the Program Chair for each year.
AAOS, on the other hand, requests during its abstract submission process that authors disclose whether their work was previously published or presented at another meeting.
Based upon typical abstract submission times (Fig. 1), it remains plausible that an investigator who submits a paper for the next year’s COA meeting in October will not have received a response from AAOS about the same abstract submitted in March–April of the same year.
Inconsistencies between orthopedic presentations and subsequent full-text publications have previously been reported.13 The current research is the first to report inconsistencies between papers presented at multiple meetings. Investigators who presented papers at COA altered their AAOS presentations by changing the wording in the title of their paper 24% of the time, by adding or removing authors 38% of the time, or by changing authorship order 34% of the time. Perhaps subtle changes in the study title, the addition or removal of 1–3 authors, or a change in authorship order led AAOS reviewers to believe the submitted study was an extension of previously presented work.
Advocates of duplicate presentation may argue that papers deemed important in the field should receive the broadest audience for dissemination of their findings. If this were so, we would expect such work to be published in peer-reviewed journals. Our review of the literature identified only 4 of the 29 duplicate presentations (14%) that have, to date, been published in a journal. Publication rates of abstracts from international meetings, which have reportedly ranged from 11% to 68%,3–13 suggest that as a group, the duplicate presentations studied were no different from other papers presented at both meetings. Several factors have recently been identified14 as barriers to publication after presentation at international meetings: perceived quality of the research, limits to available time, the responsibilities of manuscript preparation, difficulties with coauthors, and studies that are still in progress.
The finding that 79% of duplicate COA/AAOS presentations (23/29) occurred among Canadian investigators likely reflects the paucity of US-based abstract submissions to COA. It may also reflect a belief among Canadian researchers that presenting their work only at COA may limit the international dissemination of their findings. However, duplicate presentation was not solely a phenomenon occurring between nations. It also occurred within a single nation, as evidenced by a mean 11% rate of duplicate presentation between AAOS and US-based subspecialty meetings.
Whether duplicate presentations should be limited or supported remains debatable. In our study, 66 investigators were denied the opportunity to present their research findings at AAOS, COA or subspecialty meetings because the same paper was presented at multiple meetings. Potential strategies to limit duplicate presentation may require increased communication between the organizers of both national meetings. One potential solution may be to allow investigators to present research findings at 1 national meeting only. At the time of abstract submission, investigators may be asked to report whether they have submitted work to other meetings for consideration of presentation. Those who disclose submission to other meetings may then be required to choose only 1 meeting for presentation, allowing sufficient time for organizers to contact other investigators. If duplicate presentation is allowed, meeting organizers should enforce disclosure and report such occurrences in the program booklet.
An alternative approach could be a compromise between strict rejection of duplicate papers and welcoming them. Papers deemed by program committee members to be of international interest and to merit further dissemination may be exempted from a duplicate-presentation policy, whereas less “meritorious” duplicate papers may be rejected to allow other authors an opportunity to present (and disseminate) their work.
In conclusion, we identified a 20% duplicate presentation rate between COA and AAOS annual meetings and an average 11% duplicate presentation rate between AAOS and subspecialty meetings. Stricter enforcement of guidelines and improved dissemination of research findings at both national meetings may limit the occurrence of duplicate presentations. A better understanding of attitudes toward duplicate presentation, not only among researchers but also meeting organizers and program committees, may help provide answers to some of the questions around this issue.
Footnotes
Competing interests: Thammi Ramanan was funded by the Surgical Outcomes Research Centre, Hamilton, Ont. Dr. P.J. Devereaux was funded by a Heart and Stroke Foundation of Canada–Canadian Institutes of Health Research Award.
- Accepted October 24, 2003.