In pointed remarks, Germany’s cost-effectiveness watchdog has criticized an effort by European regulators to accelerate approval for new medicines based on limited evidence. And the concerns raised by the agency come as regulators on both sides of the Atlantic increasingly look to such approaches to get new drugs to patients with unmet medical needs.
At issue is a proposal called adaptive pathways, a term used to describe a method for jumpstarting drug approvals for select patient populations. Two years ago, the European Medicines Agency launched a pilot program with plans to compare the initial data used for approval with so-called “real world” data, which is gathered after the medicines are in use.
But after the EMA released a final progress report late last month, the German Institute for Quality and Efficiency in Health Care last week raised serious concerns about the effort. In short, the German watchdog agency maintained that the EMA failed to make its case that this approach to approving drugs can make a demonstrable difference.
“Neither industry nor EMA has a concept as to how real-world data can be used after drug approval to allow drawing reliable conclusions on benefit and harm,” the agency wrote in a statement. Moreover, “a critical discussion on the quality, potential for bias, and reliability of the data acquired” following regulatory approval “was lacking.”
In particular, the German agency chastised the EMA for pilot programs that were “vague” about collecting real-world data to supplement existing clinical trial results. And the IQWIG, as the agency is called, also complained that there was “insufficient detail” in the proposals submitted by companies to bolster and refine safety and effectiveness information.
IQWIG also complained that the EMA “justifies this information gap” by pointing to a need to keep certain company data confidential. This has been a contentious point between the EMA and drug makers. After protracted debate, the European regulator last March released new rules that attempt to create limits for companies that seek to redact trial data.
The German agency, however, was having none of it. “In view of the importance of the pilot project for drug development and the potential consequences of the considerable changes in approval procedures for patients,” IQWIG wrote, “concealment of the content and results of the discussions seems unacceptable.”
Finally, IQWIG concluded by criticizing the EMA for failing to offer proposals for using real-world trial data after approvals, notably the potential for improved value. “If this is still lacking, then it would be high time to pause for a moment and rethink the whole concept, instead of considering more drugs in the consultations on adaptive pathways, as planned by EMA,” the German agency suggested.
We asked the EMA for a response to the criticism and will update you accordingly.
It is worth noting the EMA did acknowledge in its report that adaptive pathways is not a suitable approach for developing all medicines. The regulator conceded most companies were “not ready to describe real-world data plans” or strategies because they did not yet know whether their products would prove effective. And the EMA also lamented the fact that it received few well-considered proposals about value.
[UPDATE: An EMA spokeswoman later wrote us that “we reject IQWIG’s conclusion about the limitations of real-world data.” The report states that “‘the majority of the plans were vague in terms of the purpose of collection of real-world data.’ This is not a judgement of the usefulness, in general, of collecting real-world data to assess a drug’s performance, only of the post-licensing evidence generation plan submitted by some companies. The purpose of repeat discussions among stakeholders is to refine and clarify real-world data collection plans. In several instances, EMA advised companies on what kind of real-world data and methodologies of analysis would be expected.”
She added that “methodological challenges to the use of real-world data are recognized in the report and the pilot was intended to be a learning exercise of what was feasible within the current regulatory framework, not the definition of a new regulatory standard.” A workshop is to be held in December “to further discuss best practice examples and methodological challenges.”]