
In the classic 1967 film “The Graduate,” young Benjamin Braddock was given one word of advice for his future: plastics. In that spirit, I’d like to offer Dr. Ned Sharpless, the new acting commissioner of the Food and Drug Administration, two words of advice for the future of the FDA: data standards.

Like many other experts, I grow increasingly alarmed by the slow progress in the development, implementation, and acceptance of data standards for clinical trials. As the clock ticks, the capital costs and the human costs of inaction are mounting. No matter where my work takes me, I always return to the urgent need for data standards.

I know that “data standards” may not sound as enticing as “precision medicine.” Such standards aren’t the obvious fulcrum upon which the modernization of our health system pivots. But doctors, researchers, and other health care professionals in the U.S. and around the world struggle daily with messy, nonstandard data, wasting their time, causing delays, and putting patient safety at risk.


During his final days in office, former FDA Commissioner Scott Gottlieb issued a bold and aggressive statement calling for modernizing clinical trials. He cited business models that discourage data sharing as a big impediment to this modernization. While I generally agree with Gottlieb’s call to action, I think there is a more intractable, higher-priority barrier than business models: the complete absence of top-level data standards for clinical trials. All the fancy software in the world won’t make it possible to smoothly move data from clinical systems into research platforms unless we commit ourselves to data standards.

The FDA requires pharmaceutical companies to submit data to it in a standardized format. We need to push that requirement upstream, so data will be collected in a standardized way and not merely converted and transformed after the fact. If Sharpless makes data standards a priority, he will not only capitalize on the progress of the previous commissioner but will be responsible for one of the most essential contributions to the advancement of medical research this century.


I have spent years participating in formal and informal efforts to establish data standards. This work is also central to my role as director of the Center for Research Informatics at the University of Chicago, whose mission is to provide informatics resources and services within the university and with partners in the private sector. The broad reach of this organization and my own background as a pediatric oncologist provide me with a unique view into the state of data collection, processing, and management in many corners of health care and clinical trial systems.

How do we go about establishing those standards? And where do we start?

The FDA is well-positioned to lead the charge. As the primary regulatory body for the pharmaceutical industry, it not only sets the agenda but can spur immediate action. If Sharpless and the FDA were to take the following three actions, we could finally break out of the data standards standstill and improve the cost, speed, and fidelity of clinical trials.

First, the FDA should begin working to get agreement from the pharmaceutical industry to adopt a single data standard for generating protocols and creating budgets. Who can reasonably expect the industry to adopt artificial intelligence and machine learning when the majority of trial protocols are still written in Microsoft Word?

Data collected in inconsistent and non-standardized formats require manual conversion, adding to the costs of research and drug development. This need for data transformation also pulls resources away from higher-impact work. Any viable plan for data standards development and integration necessarily begins with leveraging those standards at the point of collection.
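To make the point concrete, here is a minimal sketch, in Python, of what capturing a few protocol elements as structured data rather than as prose in a Word document could look like. The class names, fields, and values are hypothetical illustrations, not an existing standard; a real effort would need an industry-agreed schema, but anything machine-readable along these lines can be validated once and reused downstream without manual re-entry.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: these field names and values are hypothetical,
# not part of any established protocol standard.

@dataclass
class Visit:
    name: str               # e.g., "Cycle 1 Day 1"
    day: int                # protocol day relative to enrollment
    assessments: list[str]  # procedures scheduled at this visit

@dataclass
class ProtocolSummary:
    protocol_id: str
    phase: str
    arms: list[str]
    visits: list[Visit] = field(default_factory=list)

# Because the schedule is data, software can check it (no duplicate visit
# days, every assessment mapped to a budget line) and feed it directly into
# budgeting and data capture systems instead of re-keying it from prose.
protocol = ProtocolSummary(
    protocol_id="EX-001",
    phase="II",
    arms=["experimental", "standard of care"],
    visits=[
        Visit("Screening", day=-14, assessments=["informed consent", "CBC"]),
        Visit("Cycle 1 Day 1", day=1, assessments=["vital signs", "CBC"]),
    ],
)
```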

Second, the FDA must work to bridge the worlds of clinical care and clinical research. Most people are surprised to learn that data for patients in clinical trials must be manually extracted from the electronic health record and hand-entered into each pharmaceutical company’s customized and proprietary clinical trial system. The FDA should work with the pharmaceutical industry to create incentives for developing systems to move data smoothly from electronic health records into electronic data capture systems and right on through to the FDA. Gottlieb accurately called out the need for faster, more accurate data sharing throughout the industry, but he may have jumped a step ahead. Data sharing and portability must be the second step — they can’t truly be accomplished without standards for data collection.

The Health Level 7 Fast Healthcare Interoperability Resources standard (usually referred to as HL7 FHIR) is the most well-known international data standard for electronically exchanging health care information. It is now widely used throughout the industry, and the latest FHIR release will help ensure compatibility with older versions. But the standard mainly addresses interoperability on the consumer-facing side of the industry, such as portals through which patients access their records or a provider in one system passing information to a provider in another system.

FHIR does little, however, to create a closed loop of interoperability among pharmaceutical companies, research organizations, and the FDA. The FDA has an opportunity to promote the use of FHIR for the electronic transfer of data from electronic health record systems into clinical trial data capture systems. This will not only make clinical trials faster and cheaper but will also give the industry and the FDA better insight into how drugs affect patients by creating more efficient and accurate ways to report side effects and adverse outcomes.
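As a rough illustration of the kind of transfer being described, the sketch below uses FHIR’s standard REST search API to pull one patient’s hemoglobin lab results from an electronic health record as Observation resources. The base URL and patient identifier are placeholders, and authentication is omitted; a production integration would authorize itself (for example, via SMART on FHIR) and map the retrieved values into the trial’s data capture system.

```python
import requests

# Placeholder endpoint and patient ID; a real EHR exposes its own FHIR base
# URL and requires authorization (e.g., SMART on FHIR), which is omitted here.
FHIR_BASE = "https://ehr.example.org/fhir"
PATIENT_ID = "example-patient-id"

def fetch_hemoglobin_observations(base_url: str, patient_id: str) -> list[dict]:
    """Fetch hemoglobin lab results (LOINC 718-7) for one patient as FHIR Observations."""
    response = requests.get(
        f"{base_url}/Observation",
        params={
            "patient": patient_id,
            "code": "http://loinc.org|718-7",  # Hemoglobin [Mass/volume] in Blood
            "_sort": "-date",                  # newest results first
        },
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    response.raise_for_status()
    bundle = response.json()  # FHIR search results come back as a Bundle resource
    return [entry["resource"] for entry in bundle.get("entry", [])]

if __name__ == "__main__":
    for obs in fetch_hemoglobin_observations(FHIR_BASE, PATIENT_ID):
        quantity = obs.get("valueQuantity", {})
        print(obs.get("effectiveDateTime"), quantity.get("value"), quantity.get("unit"))
```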

Third, the FDA must work to require electronic health record vendors to adopt data standards that align with the FDA’s own requirements for how the industry submits its information. For decades, electronic health record systems have created interoperability minefields that not only create risk for patients but also stymie progress in the research community. While the industry has tried to address these problems on its own, the landscape is filled with bespoke systems that encourage the continued siloing of information.

In an ecosystem as large and complex as U.S. health care, data standards are certainly not a panacea for all the challenges we face. But they must come first, because any health system with this many moving parts ultimately depends on standards to ensure that every entity in the value chain can accurately leverage data to improve outcomes for patients.

That’s why I believe that data standards should be the top priority of Sharpless and the entire FDA. If we can make progress in this area, we can make transformative progress virtually everywhere else.

Sam Volchenboum, M.D., is the director of the Center for Research Informatics at the University of Chicago, a pediatric hematologist and oncologist, and the co-founder of Litmus Health, a data science platform for early-stage clinical trials.

