In a business niche as critical, closely scrutinised and heavily regulated as pharmacovigilance, it remains a puzzle why clinical development organisations struggle so often with the "end of process" task of distributing Safety Reports*, commonly known as Suspected Unexpected Serious Adverse Reactions or SUSARs. Timely safety report distribution matters both for its part in improving patient safety and because of the compelling regulatory obligation to distribute this material promptly.
For reference, the FDA's position is set out on p. 16 of its Guidance, Safety Reporting Requirements for INDs and BA/BE Studies.
The reality is that this end-of-process task sounds simple to define but is in fact complex and time critical. In addition, organisations are often so addicted to email that alternative, more dynamic and reliable means of information distribution seem strangely alien and fantastical. But why not use distribution channels that allow for active information distribution, with real-time, dynamic readership information? In an age where audience behaviour is closely tracked, isn't it time you knew who has read your latest Safety Report – and, more crucially, who has not?
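To make that concrete, here is a minimal sketch in Python of the kind of read-receipt tracking such a distribution channel enables. All names here are hypothetical – no particular product's API is implied: every acknowledgement is time-stamped, and the set of recipients who have not yet read a safety report can be computed at any moment.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SafetyReportDistribution:
    """Hypothetical read-receipt tracker for one distributed safety report."""
    report_id: str
    recipients: set                 # everyone who must acknowledge the report
    read_receipts: dict = field(default_factory=dict)

    def record_read(self, recipient: str) -> None:
        """Time-stamp the first moment a recipient opens the report."""
        if recipient in self.recipients:
            self.read_receipts.setdefault(recipient, datetime.now(timezone.utc))

    def unread(self) -> set:
        """The crucial question: who has NOT read the report yet?"""
        return self.recipients - self.read_receipts.keys()

# Usage: distribute a SUSAR report to three sites; two confirm reading it.
susar = SafetyReportDistribution("SUSAR-2024-0042", {"site_101", "site_102", "site_103"})
susar.record_read("site_101")
susar.record_read("site_102")
print(susar.unread())   # {'site_103'} -> chase before the deadline, not after
```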
While good documentation practices have always been central to GCP and good clinical operations, the emergence of distinct eTMF systems and the related TMF teams has meant that TMF management has, to an extent, been allowed to morph into something large and semi-detached from trial execution. This has never been a logical or particularly desirable development.
In fact it is arguable that this runs counter to GCP. The ICH E6(R2) requirement for continuous and contemporaneous record keeping is hard, if not impossible, to comply with when eTMF systems are kept away from frontline clinical trial staff and accessed only by specialist TMF staff. Remember that the TMF is meant to allow you to retell "the story of your study."
What seems to have happened is that these "traditional" eTMF tools are so complicated to use, or have so many features, that sponsors judge it better to have dedicated TMF teams shielding the rest of the world from the eTMF tool. Likewise, as TMF specialists finally knock a study TMF into presentable shape, there is understandable cultural resistance to letting all and sundry back in just to mess it up again. While we can be sympathetic to both these issues, they are a case of the tail wagging the dog – they run counter to the continuous and contemporaneous requirement, and they impede other aspects of GCP and sensible TMF management.
The reality is that clinical trials do get messy and complicated. Documents have multiple versions. They are revised repeatedly as the study progresses and issues arise – and sometimes multiple versions are deployed in parallel, e.g. protocol amendments for different countries get approved at different times. Everyone knows this, especially inspectors.
There is an equally compelling requirement on sponsors to ensure study staff have access to up-to-date documentation at all times. How can that be done in a traditional TMF model where access to the TMF is not provided, for example, to sites? It is nearly impossible – and in practice people find themselves storing unauthorised copies of documents just to get on with things, which is clearly an unacceptable reality.
So what are the alternatives?
TMF management practice that is embedded into clinical operations practice – documents are born, live and evolve in a controlled environment without getting lost, while being automatically captured as part of "the story of your study."
The electronic Trial Master File (eTMF) landscape has evolved beyond traditional, large-scale, enterprise eTMF deployments designed for companies with large numbers of studies (and staff!). Emerging biopharma companies and new CROs with focused pipelines now need and expect flexible, powerful TMF solutions as well – but without the army of TMF support staff that large pharma all too often deploys in the background.
This evolution has been driven by two forces: changing user expectations and evolving regulatory scrutiny.
While there remains a role for traditional eTMF repositories as a final study document archive, user and regulatory expectations have moved beyond simple document repositories accessed only by a select inner circle of study staff. Access to the single source of truth now needs to be study-wide.
Everyone expects ready access to the latest versions of the study materials relevant to them. Staff at sites and in distributed study teams are familiar with file-sharing services for all sorts of digital material in both their professional and personal lives. They expect similar capabilities within the regulated context of their clinical trials – from their desktop PCs to tablets and phones.
At the same time, regulatory scrutiny has evolved. Inspectors now expect to see a steady flow of study documents into the TMF over time. They know new versions emerge when necessary, and they expect to see proof that those new versions were distributed to the correct audience. It is no longer acceptable to upload a bolus of study material into the TMF just before an inspection. Inspectors also expect study decisions, such as the study design and the choice of suppliers, to be documented at the time – e.g. the RFPs, supplier proposals and notes from bid defense discussions need to be in the TMF.
Achieving a steady flow of content into your TMF using just a small, central team of TMF administrators is close to impossible. Bottlenecks occur as the flow of documents spikes from day to day. Delays in filing lead to GCP violations, and users are tempted to keep local copies of potentially outdated materials simply to ensure they can do their jobs.
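One simple way to see whether a TMF is actually being kept contemporaneously is to measure the lag between each document being finalised and it being filed. A minimal sketch, assuming each TMF record carries both dates (the record layout and the five-day threshold are illustrative assumptions, not a regulatory rule):

```python
from datetime import date

# Hypothetical TMF records: (document_id, finalised_on, filed_on).
tmf_records = [
    ("protocol_v2.0", date(2024, 3, 1),  date(2024, 3, 4)),
    ("site_101_fdf",  date(2024, 3, 10), date(2024, 5, 28)),
    ("dsur_2024",     date(2024, 4, 2),  date(2024, 4, 3)),
]

MAX_LAG_DAYS = 5   # illustrative threshold for "contemporaneous" filing

# Flag every document filed later than the threshold allows.
for doc_id, finalised, filed in tmf_records:
    lag = (filed - finalised).days
    if lag > MAX_LAG_DAYS:
        print(f"{doc_id}: filed {lag} days after finalisation - find the bottleneck")
```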
Traditional eTMF deployments often fragment the associated tasks of creating and managing documents: the person creating a document may not be able to actually file it into the TMF, or to add the appropriate TMF codes. Why this fragmentation and inefficiency is tolerated is a puzzle.
Another puzzling aspect of traditional TMF management is the delegation of TMF coding to junior staff – who can be the least qualified to understand the true content and implications of a document. What price the misfiling of documents during an inspection?
Instead of needing multiple resources to:

- create a document
- distribute it to the right audience
- code and file it into the TMF
All three of these can be done at the same time. The document author can create, share (distribute) and file (code) a document in real time – with distribution to readers being automatically tracked by your TMF system.
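As a sketch of what that single-step workflow could look like in code (all names are hypothetical; this is not any particular platform's API), the author's one action fans out into filing, coding and tracked distribution:

```python
from datetime import datetime, timezone

def publish_document(title: str, content: bytes, tmf_code: str, audience: set) -> dict:
    """Hypothetical one-step publish: create, code, file and distribute at once.

    In a fragmented workflow these would be hand-offs between three people;
    here they are a single atomic action by the document's author.
    """
    return {
        "title": title,
        "content": content,
        "tmf_code": tmf_code,                    # the author codes the document...
        "filed_at": datetime.now(timezone.utc),  # ...files it immediately...
        "distributed_to": audience,              # ...and distributes it,
        "read_by": {},                           # with readership tracked from here on
    }

# Usage: an author publishes a protocol amendment in one step.
doc = publish_document(
    title="Protocol Amendment 3 (UK sites)",
    content=b"%PDF-...",
    tmf_code="02.01.01",   # placeholder classification code, not a real TMF RM mapping
    audience={"site_101", "site_102", "cro_pm"},
)
print(sorted(doc["distributed_to"]))
```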
This more practical and scalable approach for all document creators (including those at sites, suppliers, CROs and sponsors) is much more in keeping with the lean operations of many emerging biopharma companies. It also allows central TMF staff to become more like librarians, ensuring items are correctly coded and indexed. This affords the new TMF Librarian more bandwidth to identify and pursue misclassified or missing content – a much more valuable contribution in preparation for future inspections.
By Adam Wood, VP, Business Development at myClin
About myClin: Are you really ready for an inspection? Start using the myClin platform to take control of intricate and error-prone study documentation. Keep essential information at your fingertips to stay audit-ready at all times. Get started with more free resources and a demo at myClin.com
About Clinical Works: Are you ready to move faster and smarter with a high-impact, curated ClinOps team? We help new biopharma ventures and start-ups bridge the gap from investment to clinical development. Find out more at clinical.works