By Adam Wood, VP, Business Development at myClin
While good documentation practices have always been central to GCP and good clinical operations, the emergence of distinct eTMF systems and related TMF teams has meant that TMF management itself has, to an extent, been allowed to morph into something large and semi-detached from trial execution. This has never been a logical or particularly desirable development.
In fact it is arguable that this runs counter to GCP. The ICH E6 R2 requirement for continuous and contemporaneous record keeping is hard, if not impossible, to comply with when eTMF systems are kept away from frontline clinical trial staff and accessed only by specialist TMF staff. Remember that the TMF is meant to allow you to retell “the story of your study.”
What seems to have happened is that these “traditional” eTMF tools are so complicated to use, or have so many features, that sponsors judge it better to have dedicated TMF teams shielding the rest of the world from the eTMF tool. Likewise, as TMF specialists finally knock a study TMF into presentable shape, there is understandable cultural resistance to letting all and sundry back in just to mess it up again. While we can be sympathetic to both these issues, they are a case of the tail wagging the dog – they run counter to the continuous and contemporaneous requirement, and they impede other aspects of GCP and sensible TMF management.
The reality is that clinical trials do get messy and complicated. Documents have multiple versions. They get revised repeatedly as the study progresses and issues arise – and sometimes multiple versions are deployed in parallel, e.g. protocol amendments for different countries get approved at different times. Everyone knows this, especially inspectors.
There is an equally compelling requirement on sponsors to ensure study staff have access to up-to-date documentation at all times. How can that be achieved in a traditional TMF model where access to the TMF is not provided to, for example, sites? It is nearly impossible – and in practice people find themselves storing unauthorised copies of documents just to get on with things, which is clearly an unacceptable reality.
So what are the alternatives?
- A collaborative, living TMF
- Single source of truth
- Document distribution that is proven and defendable
- TMF management practice embedded into clinical operations practice – documents are born, live and evolve in a controlled environment without getting lost, while being automatically captured as part of “the story of your study.”
Evolving TMF System Landscape
The electronic Trial Master File (eTMF) landscape has evolved beyond traditional, large-scale, enterprise eTMF deployments designed for companies with large quantities of studies (and staff!). Emerging biopharma companies and new CROs with focused pipelines now need and expect flexible, powerful TMF solutions as well – but without the army of TMF support staff that large pharma all too often deploy in the background.
This evolution has been driven by:
- Regulatory agency expectations and ICH E6 R2 requirements for “continuous and contemporaneous” record keeping and direct access to systems during inspections.
- The arrival of mainstream cloud-based storage and sharing services at both consumer and business-to-business levels, e.g. Google Drive, SharePoint, OneDrive, Dropbox, Box, etc.
- Demand for “living TMF” deployments where both final and in-process documents can be accessed securely.
While there remains a role for traditional eTMF repositories as a final study document archive, user and regulatory expectations have moved beyond such simple document repositories accessed only by a select inner circle of study staff. Access to the single source of truth now needs to be study-wide.
Everyone expects ready access to the latest versions of the study materials that are relevant to them. Staff at sites and in distributed study teams are familiar with using file sharing services for all sorts of digital material in both their professional and personal lives. They expect similar capabilities within the regulated context of their clinical trials – from their desktop PCs to tablets and phones.
At the same time, regulatory scrutiny has evolved. Inspectors now expect to see a steady flow of study documents into the TMF over time. They know, and expect to see, new versions emerge when necessary, along with proof that those new versions were distributed to the correct audience. It is no longer acceptable to upload a bolus of study material into the TMF just before an inspection. Inspectors also expect study decisions, such as the study design and choice of suppliers, to be documented at the time – e.g. the RFPs, supplier proposals and notes from bid defense discussions need to be in the TMF.
In-Line Processing is More Efficient
Achieving a steady flow of content into your TMF using just a small, central team of TMF administrators is close to impossible. Delays and bottlenecks occur as the flow of documents spikes from day to day. Delays in document uploads lead to GCP violations, and users are tempted to keep local copies of potentially outdated materials simply so they can do their jobs.
Traditional eTMF deployments often fragment the associated tasks of creating and managing documents, meaning the person creating a document might not be able to actually file it into the TMF, or add the appropriate TMF codes. Why this fragmentation and inefficiency is tolerated is a puzzle.
Another puzzling aspect of traditional TMF management is the delegation of TMF coding to junior staff – who can be the least qualified to understand the true content and implications of a document. What price the misfiling of documents during an inspection?
Instead of needing multiple resources to:
- Create a document
- Distribute a document
- File a document
All three can be done at the same time: the document author can create, share (distribute) and file (code) a document in real time, with distribution to readers automatically tracked by your TMF system.
This more practical and scalable approach for all document creators (including those at sites, suppliers, CROs and sponsors) is much more in keeping with the lean operations of many emerging biopharma companies. It also allows central TMF staff to become more like librarians, ensuring items are correctly coded and indexed. This affords the new TMF Librarian more bandwidth to identify and pursue misclassified or missing content – a much more valuable contribution in preparation for future inspections.
Join us Thursday, September 5th for our next webinar, Start Digitalizing, Stop Digitizing your Study Oversight, and learn how to leverage technology to revamp a clinical trial conduct model and invoke value-producing capabilities. Registration is now open!