The U.S. Food and Drug Administration (FDA) must be more consistent in its decision-making and standardize its device submission analyses to improve its long-criticized product review process, an independent report recommends.
The report, conducted by Tysons Corner, Va.-based consultants Booz Allen Hamilton, was ordered under terms of the Food and Drug Administration Safety and Innovation Act of 2012 (FDASIA), which reauthorized the Medical Device User Fee Act (MDUFA). One section of MDUFA -- "Independent Assessment of Review Process Management" -- required a two-phase assessment of the FDA's device submission review process. The first phase focuses on near-term steps the agency can take to improve device submission reviews and includes an evaluation of the FDA's refuse to accept (RTA), substantive interaction (SI), interactive review (IR) and missed MDUFA decision (MDD) policies as well as training, retention and IT infrastructure. FDA executives promised to act quickly on several "high priority" recommendations to improve its regulatory processes.
Booz Allen Hamilton interviewed industry representatives and FDA officials to determine the areas in need of improvement, basing its suggestions on stakeholder feedback, outside literature and review data analyses. It came up with four recommendations for the FDA:
1.) Develop criteria and establish mechanisms to improve consistency in decision making throughout the review process.
A "recurring issue" identified during research for the report was the inconsistency of the FDA's review decisions, particularly a "lack of transparency" that contributed to a perception that the Center for Devices and Radiological Health (CDRH) regulated within a black box. In addition, reviewers frequently reference new standards that, while applicable to new submissions, were not in effect when the original submission was filed.
"Development of tools, criteria and/or mechanisms for assessing and ensuring the consistency of review processes would help ameliorate this issue," Booz wrote. Reviewers might also explain to companies early in the process which standards they intend to apply during review, the report suggested.
2.) Provide mandatory full staff training for the three primary IT systems that support MDUFA III reviews.
Recent IT upgrades have led to confusion among reviewers regarding documentation (which documents to store, where to store them and the best ways to integrate work across systems). Not all reviewers are trained on the CDRH's Center Tracking System (CTS), its central document tracking tool for premarket submissions, Booz noted.
3.) Identify metrics and incorporate methods to better assess review process training satisfaction, learning, and staff behavior changes.
Reviewers often fail to fully understand the science behind the products they review or to keep up with the latest scientific advancements. Reviewers also must be thoroughly versed in the FDA's regulations -- inconsistent or incorrect application of review standards can slow down an application or, worse, improperly reject it, denying patients access to safe and effective therapies.
Booz's report recommended a strong focus on training and education for reviewers, including an assessment of the tools the agency uses to educate its staff and whether those tools are the best fit for the purpose. Follow-up surveys and post-training courses could help ensure that staff members retain their knowledge, Booz wrote.
4.) Adopt a holistic, multi-pronged approach to address five quality component areas to standardize process lifecycle management activities and improve consistency of reviews.
In the longest section of the report, Booz made several major recommendations to improve the review process by making it more standardized.
For example, Booz recommended that the management of the review process be improved by formally documenting decisions to intervene in reviews and, if necessary, the decisions taken to bring a problem to resolution. This would further promote accountability, communication and proper follow-up, the report explained.
Staff also should be taught the best ways to use CDRH's numerous document control databases. Staff members currently track documents through several different quality control methods, which makes it difficult to compile review statistics and find errors. The databases should be audited and a new standard implemented, Booz said.
Finally, Booz said CDRH should identify and develop internal metrics to continually assess the efficiency and efficacy of its review processes. Such metrics would enable the agency to detect earlier in the review process when performance targets are not being met, and allow it to make process improvements in real time.
A more comprehensive assessment is set to be published in six months, at which time the FDA is also expected to unveil an implementation plan for the recommendations.