Original Research
Resident Research Productivity: Does Being a Prolific Medical Student Correlate with Research Productivity During Orthopaedic Residency?
Todd Heig, MD¹; Aryan Rezvani, BS³; Kiya S. Safavi, MD²; Trevor Tompane, MD¹; Cory Janney, MD¹
¹Department of Orthopaedic Surgery, Naval Medical Center San Diego; San Diego, CA, USA
²Texas A&M College of Medicine; College Station, TX, USA
³School of Medicine, The University of Texas Medical Branch; Galveston, TX, USA
Corresponding Author: Cory Janney, MD, Department of Orthopaedic Surgery, Naval Medical Center San Diego, 34800 Bob Wilson Dr., San Diego, CA 92134, USA; e-mail: cory.f.janney.mil@health.mil
DOI: 10.18600/toj.080102
ABSTRACT
Background: Elimination of the numerical Step 1 score creates a data void for predicting the research productivity of orthopaedic surgery residents. This study determines whether the research accomplishments and standardized examination scores of residency applicants correlate with research productivity during residency or with acceptance to a top-25 NIH-funded orthopaedic residency program.
Methods: 418 ERAS applications were reviewed between 2020 and 2021. Pre-residency research activity, NIH funding of each applicant's medical school and residency program, publications, and Step scores were assessed.
Results: Applicants averaged 3.3 medical school research experiences, 5.6 publications or presentations, 1.3 PubMed-indexed publications before residency, and 3.2 PubMed-indexed publications during residency; the average Step 1 score was 242. Poor-to-fair correlations were found between publications during residency and research involvement (ρ=0.23), total publications in medical school (ρ=0.38), and PubMed-indexed publications during medical school (ρ=0.31). Correlations were also poor between acceptance to a top-25 NIH-funded orthopaedic residency and medical school research experience (ρ=0.15) or total publications during medical school (ρ=0.17). Regression analysis revealed statistically significant associations between medical school research productivity and publication during residency, but with small effect sizes and poor predictive model fit.
Conclusion: The volume of work produced as a medical student and attendance at a highly NIH-funded medical school correlate poorly with residency research output.
Level of Evidence: IV; retrospective data analysis.
Keywords: Orthopaedic residency; Resident research; Residency application.
INTRODUCTION
Each year, the orthopaedic surgery match remains extremely competitive. The most recent report published by the National Resident Matching Program (NRMP) [1] detailed the statistics of applicants from 2020. Of the 804 applicants, 645 matched (80.2%). In that cycle, the average United States Medical Licensing Exam (USMLE) Step 1 score was 248 for matched applicants and 239 for unmatched applicants. These data were consistent with previous reports from 2016 and 2018. Additionally, 40.3% of matched applicants were members of the Alpha Omega Alpha (AOA) honor society, matched applicants averaged 14.3 abstracts, publications, and presentations, and 33.6% graduated from one of the 40 United States medical schools with the highest NIH funding [1-3]. These figures underscore how competitive orthopaedic surgery remains for allopathic applicants.
Each year, residency programs are tasked with reviewing applications, interviewing applicants, and ultimately offering positions to graduating medical students. When deciding which applicants are most suitable for a particular program, both objective and subjective variables are taken into consideration. Students are evaluated objectively based on Step scores, participation in research, class rank, and AOA membership, and subjectively through personal statements, letters of recommendation, performance on rotations, and interviews. These metrics rise every year as the quality of applicants and the requirements to match successfully increase [1-3]; as a result, obtaining a residency position grows ever more competitive.
Given that an applicant’s Step 1 score directly and indirectly affects their chances of matching, much emphasis has been placed on Step 1 by both applicants and residency programs. Programs commonly set a minimum threshold on Step 1 scores when considering applicants. These thresholds are often significantly above the national mean score. Therefore, an applicant’s score can directly impact their likelihood of matching by affecting their interview eligibility. Additionally, many medical schools heavily consider Step 1 scores when nominating AOA members.
The USMLE Invitational Conference on USMLE Scoring (InCUS) met in March 2019 to evaluate USMLE Step 1 scoring [4]. The outcome was that, beginning in 2022, Step 1 would be reported only as Pass/Fail. This change was made to reduce overemphasis on Step 1 performance while retaining the ability of medical licensing authorities to use the examination for its primary purpose of determining eligibility for medical licensure [4]. By necessity, this change will place greater emphasis on the other metrics that programs use to screen and evaluate applicants.
The purpose of this study was to identify whether those metrics provide adequate data for screening or evaluation, and to assess whether research productivity prior to residency, Step scores, or attendance at a top-25 NIH-funded medical school or residency program correlates with research productivity during residency. We hypothesized that research productivity during residency would correlate poorly with all of the other objective metrics.
METHODS
This study was performed between 2020 and 2021. The Naval Medical Center San Diego Institutional Review Board reviewed the protocol and determined that this work was exempt.
The 2015 Electronic Residency Application Service (ERAS) applications of 418 orthopaedic surgery residency applicants were reviewed. Two hundred and ten applicants were verified as having matched to a United States orthopaedic surgery residency and were included in the analysis. These applications were used to tabulate research productivity before residency in the form of PubMed-indexed publications, pending publications, textbook chapters, poster and oral presentations, other publications, and research experiences. USMLE Step 1 and Step 2 scores were collected for each applicant, when available, and the NIH funding [5] of each applicant's medical school and residency program was included in the data analysis.
The PubMed database was searched for authorship by each applicant, all of whom were in their fifth postgraduate year (PGY5) at the time of the study. In cases where the applicant’s name was common and there was a question of authorship, 1 of 2 authors (TH or CJ) attempted to correlate the publication with either the applicant’s institution or research experiences, as listed on their residency application. If it could not be verified, the applicant did not receive credit for the article. If the publication occurred prior to graduation, or was included within their ERAS application, it was counted as occurring prior to residency. If it was published after graduation, from an institution associated with the applicant’s residency, and/or not included in their ERAS application, it was credited as academic productivity during residency.
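The attribution rule described above can be summarized as a simple decision procedure. The following is a minimal, hypothetical sketch of that logic; the Publication fields, the credit_period helper, and the example values are illustrative assumptions rather than part of any actual tooling used in the study.

```python
# Hypothetical sketch of the publication-attribution rule; field names are illustrative.
from dataclasses import dataclass

@dataclass
class Publication:
    year: int             # year the article appeared on PubMed
    listed_on_eras: bool  # whether the article was listed in the ERAS application
    affiliation: str      # institution named on the article

def credit_period(pub: Publication, graduation_year: int, residency_program: str) -> str:
    """Classify a verified publication as pre-residency or residency output."""
    if pub.year < graduation_year or pub.listed_on_eras:
        return "pre-residency"
    if pub.affiliation == residency_program:
        return "residency"
    return "uncredited"   # authorship could not be tied to either period

# Example: an article published after graduation, affiliated with the residency program
print(credit_period(Publication(2019, False, "Program X"), 2016, "Program X"))  # -> residency
```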
Summary statistics were tabulated for each variable. Considering the non-normal distribution of many variables in the data set, Spearman's rank correlation coefficients were calculated to determine the degree of association between variables. For variables collected as count data, Poisson regression was used to measure the degree of association between each type of research experience in medical school and publication during residency training. Finally, a multivariate regression model was created based on the detected associations between each potential predictor and academic publications during residency. A minimal sketch of this analysis plan is shown below.
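The sketch assumes the applicant-level variables are held in a data table with illustrative column names; the actual dataset and statistical software used by the authors are not specified, so the file name and columns here are assumptions.

```python
# Illustrative analysis sketch; "applicants.csv" and all column names are hypothetical.
import pandas as pd
from scipy.stats import spearmanr
import statsmodels.api as sm

df = pd.read_csv("applicants.csv")

# Spearman rank correlation, appropriate for non-normally distributed variables
rho, p_value = spearmanr(df["med_school_pubs"], df["residency_pubs"])

# Univariate Poisson regression for a count outcome (publications during residency)
X = sm.add_constant(df[["med_school_pubs"]])
poisson_fit = sm.GLM(df["residency_pubs"], X, family=sm.families.Poisson()).fit()

print(f"Spearman rho = {rho:.2f} (P = {p_value:.3f})")
print(poisson_fit.summary())
```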
RESULTS
Of the 418 orthopaedic surgery residency applicants that were reviewed, 210 applicants matched to a United States orthopaedic surgery residency and were included in the analysis. Of these, 200 (95.2%) received a degree from an allopathic medical school and 10 (4.8%) from an osteopathic medical school. Twenty-nine students (13.8%) attended a medical school ranked in the top 25 for NIH funding, and 45 (19.0%) attended an orthopaedic residency program similarly ranked in the top 25 for NIH funding. Mean USMLE Step 1 and Step 2 scores for the cohort were 242 and 249, respectively.
The average number of pre-residency research experiences across applicants to orthopaedic residency was 3.3 (SD=1.9). The various works produced from those experiences included a mean total of 5.6 (SD=6.3) publications or presentations. Mean number of PubMed-indexed publications before residency was 1.3 (SD=2.2). During residency, the same cohort was responsible for an average of 3.2 (SD=5.3) PubMed-indexed publications. Summary statistics are presented in Table 1. Notably, the total research experiences during medical school and number of PubMed-indexed publications during medical school and residency exhibited non-normal distributions when tested for skewness.
Spearman’s correlation statistics revealed poor-to-fair correlations between PubMed-indexed production in residency and medical school research involvement (ρ=0.23), total publications in medical school (ρ=0.38), and PubMed-indexed publications while in medical school (ρ=0.31). The same analysis demonstrated poor correlation between acceptance to a top-25 NIH-funded orthopaedic residency program and medical school research experience (ρ=0.15) or total publications during medical school (ρ=0.17).
Poisson regression analysis revealed limited associations with research productivity during residency. Research experience during medical school (β=0.14, P&lt;0.001), total publications or presentations (β=0.05, P&lt;0.001), and PubMed-indexed publications during medical school (β=0.13, P&lt;0.001) were associated with productivity in residency, but with very poor fit of the respective univariate Poisson regression models. USMLE Step 1 (β=0.01, P&lt;0.001) and Step 2 scores (β=0.01, P&lt;0.001) were also weakly associated with residency research productivity, again with poorly fit models and low coefficient values. Attending a top-25 NIH-funded medical school was not associated with research productivity in residency; however, attending a top-25 NIH-funded residency program was associated with increased PubMed-indexed output (β=0.53, P&lt;0.001). Univariate regression results are presented in Table 2.
Finally, a multivariate logistic regression model was constructed using a binary variable for research productivity in residency as the dependent variable (i.e., residents were categorized as either publishing or not publishing during residency), with independent variables including attendance at a top-25 NIH-funded medical school or residency program, PubMed-indexed publication in medical school, and research experience in medical school. This model revealed a negative but nonsignificant correlation between attendance at top-25 NIH-funded institutions and publication during residency, and no correlation with research experiences during medical school. There was a positive correlation with PubMed-indexed publication during medical school (odds ratio 1.48, P=0.008), but the model was poorly fit to the cohort data (R=0.05). The multivariate model results are summarized in Table 3.
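For illustration, the following is a hedged sketch of how such a binary-outcome model and its odds ratios could be fit; the file name and column names are hypothetical, and this is not the authors' actual code.

```python
# Hypothetical sketch of the multivariate logistic model; columns are illustrative.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("applicants.csv")

# Binary outcome: did the resident publish at all during residency?
df["published_in_residency"] = (df["residency_pubs"] > 0).astype(int)

predictors = ["top25_med_school", "top25_residency",
              "med_school_pubmed_pubs", "med_school_research_experiences"]
X = sm.add_constant(df[predictors])

logit_fit = sm.Logit(df["published_in_residency"], X).fit()

# Exponentiated coefficients give odds ratios (e.g., roughly 1.48 per additional
# PubMed-indexed publication in medical school, as reported above)
print(np.exp(logit_fit.params))
```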
DISCUSSION
In 2016, the American Orthopaedic Association's Council of Orthopaedic Residency Directors convened to address a critical issue: identifying and presenting best practices for the recruitment of residents. A survey of program directors conducted at that meeting drew 99 respondents and found that, over the preceding 6 years, 77% of respondents had placed 1 or more residents on probation or remediation, 48% had placed 2 or more, and 40% had terminated a resident, with 12 respondents having terminated more than 1 [6]. Given that the vast majority of programs accept fewer than 8 residents annually, identifying capable applicants and matching the right residents to the appropriate programs is of great importance.
In 2020, there were 849 orthopaedic residency positions offered and 1177 applicants. Among those who matched, the median number of contiguous applications was 12; extrapolating that figure across all 1177 applicants yields approximately 14,124 applications. Despite this volume, there were only 1.41 applicants per position and, among United States medical school seniors, 1.01 applicants per position [1]. Given the ease of applying through ERAS, the onus of winnowing this vast pool of applications lies with the programs.
Campbell et al performed a literature search for publications by members of the 2013-2014 intern class, identified through the Accreditation Council for Graduate Medical Education (ACGME) education website, and found that matched applicants at research-focused institutions tended to have more publications than those at other programs [7]. Our own data support this: attendance at a top-25 NIH-funded residency correlated significantly with an increase in PubMed-indexed publications compared with attendance at an NIH-funded residency outside the top 25. Additionally, our analysis showed poor correlation between research during medical school and attendance at a top-25 NIH-funded residency. Therefore, a large contributor to resident research productivity is, in fact, the program itself. This is further supported by Torres et al, who compared resident research production before and after a dedicated research program was developed and found that the dedicated program resulted in more publications (1.25 per resident vs 0.55) [8]. Recognizing this, the ACGME program requirements for orthopaedic surgery now mandate 60 days of dedicated research time during residency [9].
In the ERAS application, there are few objective measures of an applicant: USMLE Step 1 and Step 2 Clinical Knowledge scores, the number of research experiences, publications, presentations and posters, and AOA membership. Our data reveal that, while research participation is an important part of an application, it does not serve as a convenient metric, as Step scores do, for quickly narrowing an applicant pool. However, given the elimination of numerical Step 1 scores, these research endeavors will likely be weighted more heavily when evaluating future applicants.
Our study has limitations. First, these are applications submitted through ERAS to a single program, which introduces selection bias into the sample. However, the sample size is relatively large and, given that matched applicants now submit a median of 12 or more applications, our applicant pool is reasonably representative of the applicant pool at large. Indeed, 418 applicants in 2016 would comprise 40.4% (418 of 1034) of the total applicant pool, and 210 matched applicants would comprise 29% (210 of 717) of the matched applicant pool. Second, with the exception of PubMed-indexed publications, we rely on the ERAS applications for all pre-residency research data, which makes the data dependent on the applicant's honesty. Dale et al noted in 1999 that 18% (14 of 76) of citations were considered misrepresentations, with 17% (11 of 64) of applicants responsible for inaccurate bibliographies [10]. Others have demonstrated similar results and found that the incidence of publication misrepresentation did not correlate with other measures of an applicant's academic performance (i.e., AOA status, Step scores, etc.) [11]. This problem is not isolated to orthopaedic surgery or to the United States, as publication misrepresentation has been identified in a number of subspecialties and countries [12-15]. This further emphasizes the importance of careful applicant review and selection.
Third, the orthopaedic program used as the data source does not have a research year and is not a top-25 NIH-funded residency. This introduces additional bias, as applicants drawn to this type of program may differ from those applying elsewhere. However, Krueger et al reported in 2016 on 3 programs, 1 with no protected research time, 1 with optional research time, and 1 with a mandatory research year, and found no significant difference in the quality or quantity of research produced among the 3 programs, indicating that productivity depends more on the faculty and program than on protected research time [16]. A large-scale, multi-program inquiry would be required to adequately control for program differences but, as detailed above, we believe we have a reasonable cross section that should decrease inherent bias and give a representative view of applicants and their research productivity.
CONCLUSIONS
With the transition of USMLE Step 1 to a Pass/Fail result, greater emphasis will be placed on other objective measures of performance during the application process. Step 2 Clinical Knowledge will likely be used as a surrogate objective score, along with AOA membership and research productivity. Although we found pre-residency research productivity to be somewhat correlated with residency research productivity, we urge caution, as the strongest association with research productivity during residency was the NIH funding of the residency program itself.
REFERENCES
[1] National Resident Matching Program. Charting outcomes in the match: senior students of U.S. MD medical schools. 2nd ed. Washington, DC: National Resident Matching Program; 2020.
[2] National Resident Matching Program. Charting outcomes in the match: U.S. allopathic seniors. 2nd ed. Washington, D.C.: National Resident Matching Program; 2018.
[3] National Resident Matching Program. Charting outcomes in the match for U.S. allopathic seniors. 1st ed. Washington, DC: National Resident Matching Program; 2016.
[4] InCUS Planning Committee. Summary report and preliminary recommendations from the Invitational Conference on USMLE Scoring (InCUS), March 11-12, 2019.
[5] Roskoski RJ, Parslow TG. Ranking tables of NIH funding to US medical schools in 2020. Blue Ridge Institute for Medical Research; 2020. Available at: http://www.brimr.org/NIH_Awards/2020/default.htm. Accessed September 2020.
[6] Porter SE, Razi AE, Ramsey TB. Novel strategies to improve resident selection by improving cultural fit: AOA critical issues. J Bone Joint Surg Am. 2017;99(22):e120.
[7] Campbell ST, Gupta R, Avedian RS. The effect of applicant publication volume on the orthopaedic residency match. J Surg Educ. 2016;73(3):490-5.
[8] Torres D, Gugala Z, Lindsey RW. A dedicated research program increases the quantity and quality of orthopaedic resident publications. Clin Orthop Relat Res. 2015;473(4):1515-21.
[9] Accreditation Council for Graduate Medical Education. ACGME program requirements for graduate medical education in orthopaedic surgery: Accreditation Council for Graduate Medical Education; 2020. Available at: https://www.acgme.org/Specialties/Orthopaedic-Surgery/Overview.
[10] Dale JA, Schmitt CM, Crosby LA. Misrepresentation of research criteria by orthopaedic residency applicants. J Bone Joint Surg Am. 1999;81(12):1679-81.
[11] El Beaino M, Hagedorn JC 2nd, Janney CF, Lindsey RW. Scientific publication misrepresentation among orthopaedic residency applicants. Am J Surg. 2019;218(2):436-9.
[12] Baker DR, Jackson VP. Misrepresentation of publications by radiology residency applicants. Acad Radiol. 2000;7(9):727-9.
[13] Bilge A, Shugerman RP, Robertson WO. Misrepresentation of authorship by applicants to pediatrics training programs. Acad Med. 1998;73(5):532-3.
[14] Hsi RS, Hotaling JM, Moore TN, Joyner BD. Publication misrepresentation among urology residency applicants. World J Urol. 2013;31(3):697-702.
[15] Sater L, Schwartz JS, Coupland S, Young M, Nguyen LH. Nationwide study of publication misrepresentation in applicants to residency. Med Educ. 2015 Jun;49(6):601-11.
[16] Krueger CA, Hoffman JD, Balazs GC, Johnson AE, Potter BK, Belmont PJ Jr. Protected resident research time does not increase the quantity or quality of residency program research publications: a comparison of 3 orthopedic residencies. J Surg Educ. 2017;74(2):264-70.
Disclaimer: The views expressed in this article are those of the author(s) and do not necessarily reflect the official policy or position of the Department of the Navy, Department of Defense, or the United States Government. Todd Heig, Trevor Tompane, and Cory Janney are all military service members (or employees of the U.S. Government). This work was prepared as part of their official duties. Title 17, USC, §105 provides that "Copyright protection under this title is not available for any work of the U.S. Government." Title 17, USC, §101 defines a U.S. Government work as a work prepared by a military service member or employee of the U.S. Government as part of that person's official duties.