Time From Submission of Johns Hopkins University Trial Results to Posting on ClinicalTrials.gov
Posted by: Crystal Williams on: October 31, 2019
The US Food and Drug Administration Amendments Act of 2007 (FDAAA)1 requires that results of applicable clinical trials (ACTs) be submitted to ClinicalTrials.gov within 1 year of completion. ClinicalTrials.gov identifies trials that likely meet this definition as probable ACTs (pACTs). A complementary National Institutes of Health (NIH) policy requires that nonapplicable clinical trials (non-ACTs) funded by grants submitted to the NIH on or after January 18, 2017, also submit results.2
After investigators submit results, the National Library of Medicine reviews their quality and may request changes by sending “comments” to investigators. Whether or not the NIH comments on a record, it must post results within 30 calendar days of first submission; the FDAAA makes no exemption for quality review time, and the NIH has not implemented a process for posting records that do not pass review.1,3 We evaluated the time between submission and public posting of results.
We examined all records in the “JohnsHopkinsU” ClinicalTrials.gov account with results submitted from January 1, 2017, to December 31, 2018. JohnsHopkinsU includes studies conducted by the Johns Hopkins University (JHU) School of Medicine and School of Nursing. Other parts of JHU, including the Bloomberg School of Public Health and the Sidney Kimmel Comprehensive Cancer Center, have separate ClinicalTrials.gov accounts. Records are categorized as ACTs, pACTs, or non-ACTs; we combined the ACTs with the pACTs in our analysis. We determined the number of submission cycles and the number of days between submission and public posting, including the number of days under review by the NIH and by JHU. The first submission cycle began the day investigators submitted results and concluded when the NIH either posted the results or returned the record to JHU with comments. Records publicly posted by the NIH to ClinicalTrials.gov by January 7, 2019, were included in the analysis. We used both publicly available information and information available only to JHU, with data and code available at https://osf.io/9paqz/. Analyses were performed using Stata, version 15.1 (StataCorp), and were repeated stratified by record type, pACT (including ACTs) vs non-ACT, because non-ACT investigators and the NIH might give lower priority to voluntary submissions than to required submissions. Because the analysis did not involve human subjects, there was no institutional review board submission, and patient consent was not obtained.
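The interval calculations described above are straightforward date arithmetic. As a minimal sketch (the authors' actual analysis used Stata 15.1 and is available at the OSF link), the following Python illustrates the kind of computation involved; the record tuples, field layout, and cycle counts here are hypothetical, not drawn from the JHU dataset:

```python
from datetime import date

# Hypothetical records: (record_type, first_submission_date, posting_date, n_cycles).
# Values are illustrative only, not taken from the JHU ClinicalTrials.gov account.
records = [
    ("pACT", date(2017, 3, 1), date(2017, 5, 10), 2),
    ("pACT", date(2017, 6, 15), date(2017, 7, 12), 1),
    ("non-ACT", date(2018, 1, 8), date(2018, 6, 20), 3),
]

def days_to_posting(rec):
    """Days between first results submission and public posting."""
    _, submitted, posted, _ = rec
    return (posted - submitted).days

def mean_days(record_type):
    """Mean submission-to-posting interval for one record type (pACT vs non-ACT)."""
    days = [days_to_posting(r) for r in records if r[0] == record_type]
    return sum(days) / len(days)

# Proportion of records posted within the 30-day statutory window
within_30 = sum(days_to_posting(r) <= 30 for r in records) / len(records)
```

Stratifying by record type is then just a matter of calling `mean_days("pACT")` and `mean_days("non-ACT")` separately, mirroring the pACT vs non-ACT comparison in the letter.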
Of 121 records submitted, we analyzed 115 whose results were posted by January 7, 2019, including 97 of 115 pACTs (84%) and 18 of 115 non-ACTs (16%) (Table). Johns Hopkins University submitted records 1 to 5 times (mean [SD] = 2.25 [0.79]). The NIH sent first comments a mean (SD) of 32.68 (6.97) days after submission for pACTs and 63.19 (52.99) days after submission for non-ACTs. On average, pACTs were posted 76.23 (39.53) days after first submission; non-ACTs were posted after 162 (139.85) days. The NIH posted 7 of 97 pACTs (7%) within 30 days.
The NIH took more than 30 days to comment on most clinical trial results submitted by JHU in 2017 and 2018. Furthermore, multiple submission cycles delayed posting of results (Figure). Although NIH quality review may ensure that clinical trial results are understandable and accurate, federal law requires that the NIH post results within 30 days without exception for quality review. For 93% of JHU records, the NIH exceeded the 30-day statutory limit for posting. The NIH might be able to post records faster as organizations that conduct clinical trials learn to identify errors and inconsistencies before submitting their results.4 Several steps could be taken to ensure both that results are accurate and that they are posted in a timely manner. First, the NIH could summarize the types of comments that lead to multiple submissions, and the NIH could publish information about the duration of quality review, as we have done in this report. Sharing this information could help organizations that conduct clinical trials improve reporting. Second, academic organizations could support investigators by updating their policies and procedures and by hiring staff to support trial registration and reporting.5 Administrators who interact with ClinicalTrials.gov regularly could support investigators who interact with ClinicalTrials.gov infrequently. Finally, the NIH could further improve automatic checking and reduce the time to review submitted records.
JAMA Internal Medicine: https://jamanetwork.com/journals/jamainternalmedicine/fullarticle/2753423
Funding: Supported by the Johns Hopkins Institute for Clinical and Translational Research, which is funded in part by grant number UL1 TR001079. Dr Mayo-Wilson was also supported by the Johns Hopkins Institute for Clinical and Translational Research and Johns Hopkins Center for Excellence in Regulatory Science and Innovation grant from the US Food and Drug Administration (U01 FD004977-01; Caleb Alexander; Principal Investigator).