LOS ANGELES, Sept. 19 -- Internet sites that compare surgical care at various hospitals were often inaccessible and frequently displayed inconsistent results, inappropriate quality measures, and outdated information, researchers reported.
A review of six sites in September 2006 found that none provided real-time data and that the information was more than a year old, and sometimes two years old, reported Michael J. Leonardi, M.D., of the University of California, Los Angeles, and colleagues in the September Archives of Surgery.
Because surgical patients often have time to do research before an elective procedure, information about hospitals' track records can be an important tool, but there is little in the literature that examines these websites and their contents, Dr. Leonardi said.
He and his colleagues analyzed the information available at the Centers for Medicare and Medicaid Services' site Hospital Compare, the Joint Commission on Accreditation of Healthcare Organizations' site Quality Check, the Leapfrog Group's site Hospital Quality and Safety Survey Results, and three proprietary sites (names withheld).
Three criteria were used to rate website accessibility: cost, need for registration, and ease of finding the site.
The government's Hospital Compare and JCAHO's Quality Check were rated the most accessible overall. Neither required users to register or log in, and both appeared high in Internet search results. The Leapfrog Group's site, although free, was less visible in searches. The proprietary sites were harder to find, and two required a modest annual fee.
When the researchers looked at whether sites listed the sources of their data and the statistical methods used to determine hospital rankings, the government and nonprofit sites again fared best. Three of the sites allowed users to compare hospitals for multiple common operations.
For appropriateness of hospital comparisons (use of antibiotics, for example), the proprietary websites were more complete. They compared multiple surgical procedures using a combination of process (infection-prevention measures, for instance), structure (volume and cost, for example), and outcome measures. However, criteria were not applied consistently among the hospitals.
Of the proprietary sites, two allowed patients to choose ranking criteria, the researchers noted.
To determine consistency among the websites, sample searches were done for the three proprietary sites comparing three common procedures (laparoscopic cholecystectomy, hernia repair, and colectomy) at four Los Angeles-area hospitals.
The inconsistencies were significant. For colectomy, for example, one hospital was ranked best by two sites but worst by the third, and the hospital that site ranked worst was ranked best for hernia repair by another site.
The proprietary nature of the sites could account for some of these concerns, but others stem from limitations in overall data quality, the researchers said.
Financial motivation is also a concern. None of the websites require payment from hospitals or surgeons to be included in their comparisons, but as the sites gain influence in the health care market, this is a potential conflict of interest, Dr. Leonardi said.
Further work is needed to improve these issues, particularly accessibility, the quality of data reporting, the statistical methods used to create rankings, and the criteria by which hospitals and specific operations are compared, the researchers said.
"It is probably important that surgeons be involved with the development of such reporting websites so that the comparisons accurately and appropriately reflect the quality of surgical care," they emphasized.
In an accompanying discussion, John Hunter, M.D., of Oregon Health & Science University in Portland, said it appears that the public websites have little of the data the public wants, while the proprietary sites have much of that data, but it is inconsistently reliable and "you gotta pay to get it."
Dr. Hunter questioned the results of a "two-year-old study," however, and asked "how do we get better data for our patients?"
He also called provider-specific data a potential slippery slope: one might, for example, expect high mortality rates from a master surgeon who takes on difficult cases.
To this, the researchers responded that, although the Internet information is not perfect, it is at least a starting point on which further work can build.
Another commenter, Jeffrey Pearl, M.D., of the University of California, San Francisco, raised questions about the coding used to describe conditions. It would help, he said, if surgeons learned the coding language so that they could better describe their patients' issues.
Finally, Thomas R. Russell, M.D., executive director of the American College of Surgeons, said that if physicians don't influence the area of quality improvement and patient safety, "the MBAs will do it. We need to position ourselves as drivers of this quality improvement movement," he advised.