In the first big research scandal of the COVID-19 era, The Lancet and The New England Journal of Medicine (NEJM) today retracted two high-profile papers after the company that supplied the underlying data for both declined to make it available for an independent audit prompted by questions about the research. The Lancet paper, which claimed an antimalarial drug touted by President Donald Trump for treatment of COVID-19 could cause serious harm without helping patients, had had a global impact, halting trials of the drug by the World Health Organization (WHO) and others.
Three authors on the Lancet paper requested the retraction after initiating an independent review of the raw hospital patient data summarized and provided by Surgisphere, a small Chicago-based company operated by Sapan Desai, the fourth author of the study. Desai had previously said he and his co-authors—cardiac surgeon Mandeep Mehra of Harvard University and Brigham and Women's Hospital, Frank Ruschitzka of University Hospital Zürich, and Amit Patel, an adjunct faculty member at the University of Utah—were arranging such an audit of the data, but the agreement apparently fell apart.
“Our independent peer reviewers informed us that Surgisphere would not transfer the full dataset, client contracts, and the full ISO audit report to their servers for analysis as such transfer would violate client agreements and confidentiality requirements,” making the outside audit of the data impossible, the three co-authors wrote in the retraction statement. “Based on this development, we can no longer vouch for the veracity of the primary data sources.”
The NEJM study that was retracted had concluded, based on Surgisphere-provided data from hospitals around the world, that taking certain blood pressure drugs, including angiotensin-converting enzyme (ACE) inhibitors, didn’t appear to increase the risk of death among COVID-19 patients, as some researchers had suggested.
NEJM published only a short statement from the paper’s authors, who included Mehra, Patel, and Desai, as well as SreyRam Kuy of Baylor College of Medicine and Timothy Henry of Christ Hospital in Cincinnati. “Because all the authors were not granted access to the raw data and the raw data could not be made available to a third-party auditor, we are unable to validate the primary data sources underlying our article,” they wrote, along with an apology. Because the statement includes Desai, it perplexingly suggests that even he lacks access to the raw data held by his own company.
A third study using Surgisphere data and co-authored by Mehra, Patel, and Desai, among others, was only posted online as a preprint. (It is no longer available.) It reported that ivermectin, an antiparasitic drug, dramatically reduced mortality in COVID-19 patients, prompting increased use and government authorization of the drug in several Latin American countries.
It was the Lancet paper that brought Surgisphere under scrutiny, because it focused on the safety and effectiveness of hydroxychloroquine, a malaria drug whose use against COVID-19 had already become a political and scientific controversy, in large part because of Trump’s embrace of it. As soon as the study was published, it came under attack from clinicians, as well as experts in biostatistics and medical ethics, who questioned how Surgisphere, a tiny company with little publishing experience in big data analysis, could have collected and analyzed tens of thousands of patient records from hundreds of hospitals, particularly given the complexities of navigating patient confidentiality agreements.
Still, the Lancet study rattled scientists testing hydroxychloroquine in clinical trials because it suggested the drug dramatically increased the death rate of COVID-19 patients. WHO, which had paused the hydroxychloroquine arm of one such study in May because of the Lancet results, resumed it yesterday; a panel reviewing preliminary data from the trial did not find any obvious evidence of harm to patients.
Desai, a vascular surgeon, entrepreneur, and science-fiction writer, declined requests for comment on the retractions. He had said earlier that his company’s artificial intelligence software was able to tease out reliable meaning from a multitude of disparate records.
In a personal statement, Mehra said he had connected with Desai through a co-author and had personally reviewed the Surgisphere analyses for both the Lancet and NEJM papers. “When discrepancies in the data started to arise, I and the remaining co-authors immediately asked for a reanalysis from Surgisphere and then proactively contracted Medical Technology & Practice Patterns Institute to conduct an independent peer review,” Mehra said. But because Surgisphere would not transfer the primary data to the Bethesda, Maryland, institute, “I no longer have confidence in the origination and veracity of the data, nor the findings they have led to.”
Mehra conceded that in the rush to publish during the COVID-19 crisis, “I did not do enough to ensure that the data source was appropriate for this use. For that, and for all the disruptions—both directly and indirectly—I am truly sorry.”
Mehra is widely viewed as “one of the stars of the field,” says Daniel Goldstein, a cardiothoracic surgeon at the Albert Einstein College of Medicine who has collaborated with Mehra on several studies. “He is as straight an arrow as you can find,” Goldstein says. “I think he maybe was too trusting of this company, because [with] the amount of data that this database gave, it’s hard to believe someone would manipulate it.”
Patel and Ruschitzka did not respond to requests for comment.
Leigh Turner, a bioethicist at the University of Minnesota, Twin Cities, calls the retractions “unnerving and disturbing.” He says the Surgisphere case raises a bigger question about how much access to key data each author and each journal should require. “The less access they have, the greater the chances that there will be errors, data fabrication, or outright fraud.”
The retractions also show, Turner says, that outcry over the accuracy of a flood of COVID-19 preprints, which are not peer reviewed, is only one problem; a lack of rigor in the rush to publish has also reached “elite journals at the top of the academic pyramid.”
Turner says that by publishing only the author retraction statements, The Lancet and NEJM “didn’t show any self-reflection, any introspection. They should have looked at what might have gone wrong” in their own editorial process.
Calls for an outside look at the journals’ actions have begun. In a statement, psychologist Chris Chambers of Cardiff University, a member of the UK Reproducibility Network Steering Group, said: “It is right that these articles were retracted. However, the failure to resolve such basic concerns about the data during the course of normal peer review raises serious questions about the standard of editing at the Lancet and NEJM — ostensibly two of the world’s most prestigious medical journals. If these journals take issues of reproducibility and scientific integrity as seriously as they claim, then they should forthwith submit themselves and their internal review processes to an independent inquiry.”
Two simultaneous retractions in marquee medical journals may be jaw-dropping, but the disappearance of the preprint on ivermectin based on Surgisphere data deserves attention as well, says Carlos Chaccour of the Barcelona Institute for Global Health, who, along with two colleagues, raised questions about those data in a 29 May blog post. (Chaccour has archived the now-vanished manuscript, posted on preprint server SSRN on 19 April, on his institute’s website, along with an earlier version from the same authors that was posted on 6 April.)
The preprint helped make the drug all the rage in South America. Just last week, a doctor on a news show in Chile cited the study and its reported dramatic effect on mortality. “The ivermectin story has gone almost unnoticed. There’s no retraction letter—it was never published,” Chaccour says. “But its ghost lives on in Latin America.”
For Steven Tong, an infectious disease physician at the Doherty Institute in Melbourne, Australia, and an investigator on a hydroxychloroquine trial—the AustralaSian COVID-19 Trial (ASCOT)—which was paused last week in response to the Lancet results, the retractions have produced “a mix of frustration and anger … [and] a feeling that our system in research has let us all down, from the authors of the papers, obviously, through to the peer reviewers and up to the journal editors. They’ve all done a great disservice to the research world.”
ASCOT, which hadn’t yet started recruiting patients, announced today that it would start back up. The trial’s institutional review board has asked the investigators to add a statement to the patient consent form describing the recent issues around the publication and retraction. “I think that’s a very fair request, and probably something that patients will ask,” Tong says.
With reporting by Martin Enserink
doi:10.1126/science.abd1697
Charles Piller, Kelly Servick | Jun. 4, 2020, 5:30 PM