NEW HAVEN, Conn., Sept. 5 -- Most internal medicine residents had low scores in a test of the biostatistics needed to interpret published clinical research, investigators here found.
The overall mean score for residents was 41.4%, versus 71.5% for fellows and general medicine faculty with research training, Donna M. Windish, M.D., of Yale, and colleagues reported in the Sept. 5 issue of the Journal of the American Medical Association.
"Physicians must keep current with clinical information to practice evidence-based medicine," Dr. Windish wrote. However, little is known about residents' ability to understand statistical methods or to interpret research outcomes, she said.
To gain insight, the researchers did a multiprogram cross-sectional survey of 277 internal medicine residents in 11 residency programs in Connecticut. To provide data for validity testing, 10 faculty members and fellows trained in clinical investigations also completed the survey.
The residency programs included seven traditional programs, two primary care medicine programs, one medicine/pediatrics program, and one medicine/preventive medicine program. Seven programs were university-based; four were community-based.
The review articles on which the multiple-choice questions in the study were based came from six journals: American Journal of Medicine, Annals of Internal Medicine, BMJ, JAMA, The Lancet, and the New England Journal of Medicine.
The overall mean percentage correct on statistical knowledge and interpretation of results was 41.4% (95% confidence interval [CI], 39.7% - 43.3%) versus 71.5% (CI, 57.5% - 85.5%) for fellows and general medicine faculty with research training (P<0.001).
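For context on how figures like these are reported (a general illustration, not the study's own calculation), a 95% confidence interval around a mean score can be approximated from the sample mean, standard deviation, and sample size. The short Python sketch below uses invented scores.

    # Illustrative only: invented scores, not data from the JAMA study.
    import math

    scores = [55, 62, 48, 70, 66, 59]
    n = len(scores)
    mean = sum(scores) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in scores) / (n - 1))
    half_width = 1.96 * sd / math.sqrt(n)  # normal approximation for a 95% CI

    print(f"Mean {mean:.1f}% (95% CI, {mean - half_width:.1f}% - {mean + half_width:.1f}%)")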
Higher scores for residents were associated with several characteristics, including prior biostatistics training and male sex.
On individual knowledge questions, 81.6% of residents correctly interpreted a relative risk, the researchers reported.
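For readers who want to see the arithmetic behind that concept (a general illustration, not an item from the survey), a relative risk compares the event rate in one group with the rate in another. A minimal Python sketch with made-up numbers:

    # Illustrative only: hypothetical trial counts, not data from the study.
    treated_events, treated_total = 12, 200   # 6% event rate with treatment
    control_events, control_total = 24, 200   # 12% event rate in controls

    relative_risk = (treated_events / treated_total) / (control_events / control_total)
    print(f"Relative risk = {relative_risk:.2f}")
    # 0.50 means the treated group experienced the event at half the control rate.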
However, the residents were less likely to know how to interpret an adjusted odds ratio from a multivariate regression analysis (37.4%) or the results of a Kaplan-Meier analysis (10.5%).
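A Kaplan-Meier analysis, the concept fewest residents could interpret, estimates the proportion of patients still event-free over time while accounting for censored follow-up. The rough Python sketch below, using invented follow-up times, shows the product-limit idea; it is a teaching toy, not the method as applied in any particular trial.

    # Illustrative only: invented follow-up times in months.
    # event = 1 means the outcome occurred; event = 0 means the patient was censored.
    follow_up = [(3, 1), (5, 0), (7, 1), (7, 1), (9, 0), (12, 1)]

    survival = 1.0
    for t in sorted({time for time, event in follow_up}):
        events_at_t = sum(1 for time, event in follow_up if time == t and event == 1)
        at_risk = sum(1 for time, event in follow_up if time >= t)
        if events_at_t:
            survival *= 1 - events_at_t / at_risk
        print(f"Month {t:>2}: estimated event-free proportion {survival:.2f}")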
Only 58.8% (CI, 53.0% - 64.6%) could interpret the meaning of a P value.
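As a reminder of what that quantity means (a general note, not part of the survey), a P value is the probability, assuming no true difference exists, of observing a result at least as extreme as the one seen. A minimal sketch using SciPy's two-sample t test with invented scores:

    # Illustrative only: made-up scores for two hypothetical groups.
    from scipy import stats

    group_a = [72, 68, 75, 80, 71, 69]
    group_b = [60, 64, 58, 66, 63, 61]

    t_stat, p_value = stats.ttest_ind(group_a, group_b)
    print(f"P = {p_value:.4f}")
    # A small P value says a difference this large would be unlikely if the two
    # groups truly had the same mean; it does not measure clinical importance.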
Seventy-five percent of the residents reported that they did not understand all of the statistics they encountered. This lack of confidence was borne out by their low knowledge scores: on average, only eight of 20 questions were answered correctly, the researchers said.
However, 95% agreed or strongly agreed that to be an intelligent reader of the literature it is necessary to know something about statistics, and 77% indicated that they would like to learn more about statistics.
Poor knowledge in biostatistics and interpretation of study results among residents in this study probably results from insufficient training, the researchers said.
Nearly a third of the residents indicated they had never been taught biostatistics at any point in their career. When training did occur, about 70% said it occurred during college or medical school and was not reinforced in residency.
Although male sex was associated with better scores, this finding is not supported by other literature and should be interpreted with caution, the researchers said.
Because the survey was brief, the researchers acknowledged, the study was limited in its ability to assess the residents' understanding of all biostatistical concepts and research results.
Also, the survey included only those residents present at the time of the inpatient conferences when the test was given. Residents who did not attend, either by choice or chance, might have scored differently.
In addition, the study included only internal medicine residents, limiting its generalizability to physicians in other specialties.
Most residents in this study lacked the knowledge in biostatistics needed to interpret many of the results published in clinical journals, the researchers said.
"If physicians cannot detect appropriate statistical analyses and accurately understand their results, the risk of incorrect interpretation may lead to erroneous applications of clinical research," Dr. Windish said.
"Educators should reevaluate how this information is taught and reinforced in order to adequately prepare trainees for lifelong learning, and further research should examine the effectiveness of specific educational interventions," the investigators concluded.