E.U. Regulators Tackle Big Data, Definitions

Data may save many lives over the course of the 21st century, but the sheer volume of data has created a number of headaches for regulators and life science companies alike. In response, two European regulatory entities have teamed up to take on these vital questions, and their report suggests it was no trivial matter just deciding upon a definition of the term “big data.”

The European Medicines Agency (EMA) and the Heads of Medicines Agencies (HMA), the latter of which is a group of E.U. competent authorities, posted the report on the activities of the HMA/EMA Joint Big Data Task Force. The group formed six working groups to consider data derived from a number of sources, such as genomics and clinical trials, and to set standards for regulatory acceptability of those data. Data standardization is a matter of some interest, as is the age-old problem of data sharing.

The report states that there is an emphasis on linking genomic data to clinical outcomes, but the authors say this will create pressure to provide timely updates of clinically relevant genomic information in medicinal product labels. There is also keen interest in defining performance standards for companion diagnostics, an ever-more pressing consideration as highly expensive cancer therapies continue to find their way to market.

The report includes information from a survey of stakeholders regarding their familiarity with big data, and it appears a number of the competent authorities in the survey have “very limited expertise” on the subject. This lack of expertise is attributed to a perception that there is little need for it at present, but this dynamic is changing. Eight of the 24 competent authorities surveyed said they have no in-house expertise in biostatistics, but indicated they see a need for such expertise arising within the next five years; the report's authors pointed out that such assets are already a necessity.

A total of 37 life sciences companies responded to the survey, with the group nearly evenly split between companies with more than 250 employees and those with 250 or fewer. This dividing line yielded quite different sets of considerations, with the larger companies expecting that big data will have the most significant impact on target identification and patient stratification, while their smaller counterparts emphasize outcome identification and patient-reported outcomes.

The two groups also had different concerns regarding data validity and the challenges associated with the use of big data, but shared concerns regarding regulatory harmonization and a need for regulatory guidance. The report concludes that the ability to manage big data will, in the future, be critical for the advance of regulatory science, but the authors also pointed to a need for a “systemic, coordinated, and integrated European approach” to such questions. While good intentions will prove vital to any such effort, the authors say that there is still a need to prioritize the various tasks of learning to manage and use big data, and that the current effort is moving along “in the right direction but not in a consistent and consolidated way.” The authors stated, “[W]e therefore need to guard against reverting to the status quo.” Comments on the joint task force report are due April 15.
