Join us at SCDM 2018 for a session chaired by Sara Doolittle, Associate Director of Data Operations
While quality assurance and data operations are generally considered separate entities in clinical drug research, in theory and in practice they are intertwined at multiple levels involving numerous stakeholders. Sara Doolittle, Premier Research’s Associate Director of Data Operations, will explore the relationship between data and quality management at the Society for Clinical Data Management’s Annual Conference in Seattle.
Join Sara as she chairs Let's DTQR: Define the Quality Relationship of Data Management Within Clinical Trials. The session, which also features Melonie Longan, Premier Research's Director of Data Operations, begins at 1:45 p.m. on September 24.
Defining quality’s role in the data management framework
While we have all participated in audits and/or contributed to corrective and preventive actions (CAPAs), the less tangible manifestations of quality within the data management framework are also worth exploring. Such manifestations can be found in the processes governing and ensuring data integrity, data management outsourcing models and governance structures, and the systems in place to collect and process patient data.
Data underpins every aspect of drug development, and maintaining integrity from source collection through electronic data capture to analysis requires that the information be protected from accidental or intentional modification, duplication, deletion, and falsification.
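To make that protection concrete, here is a minimal sketch, assuming a simple key-value record and a SHA-256 checksum (the field names are hypothetical and not drawn from any particular EDC system), of how a fingerprint taken at capture can flag a later change:

```python
import hashlib
import json

def record_fingerprint(record: dict) -> str:
    """Compute a SHA-256 fingerprint of a record at the time of capture."""
    # Sorted keys make the serialization deterministic, so identical content
    # always produces an identical fingerprint.
    canonical = json.dumps(record, sort_keys=True, default=str)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def record_unchanged(record: dict, fingerprint: str) -> bool:
    """Return True only if the record still matches its capture-time fingerprint."""
    return record_fingerprint(record) == fingerprint

# Hypothetical lab value captured at a site visit
captured = {"subject_id": "001-023", "visit": "Week 4", "alt_u_per_l": 31}
fingerprint = record_fingerprint(captured)

# Any later edit, accidental or intentional, no longer matches the fingerprint
captured["alt_u_per_l"] = 29
print(record_unchanged(captured, fingerprint))  # False
```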
Meeting current requirements for data quality
The Food and Drug Administration (FDA) expects data acquired throughout a clinical study to meet the ALCOA standard: Attributable, Legible, Contemporaneous, Original, and Accurate. Though the standard was developed in the days of paper-based studies, it's just as relevant today and exemplifies the importance of quality in all aspects of data management.
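For illustration only, here is a hedged sketch of how those attributes might travel alongside each captured value in an electronic system; the class and field names are ours, not the FDA's, and legibility is largely handled by structured fields themselves:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class DataPoint:
    """A captured value plus the metadata ALCOA asks for (illustrative only)."""
    value: str             # Accurate: the value as observed
    recorded_by: str       # Attributable: who entered it
    recorded_at: datetime  # Contemporaneous: when it was entered (UTC)
    source: str            # Original: where it came from, e.g. "site source document"

    def alcoa_gaps(self) -> list[str]:
        """List the ALCOA attributes this record fails to satisfy."""
        gaps = []
        if not self.recorded_by:
            gaps.append("Attributable: no user recorded")
        if self.recorded_at > datetime.now(timezone.utc):
            gaps.append("Contemporaneous: timestamp is in the future")
        if not self.source:
            gaps.append("Original: source not documented")
        return gaps

point = DataPoint(
    value="Hemoglobin 13.2 g/dL",
    recorded_by="coordinator_04",
    recorded_at=datetime.now(timezone.utc),
    source="",
)
print(point.alcoa_gaps())  # ['Original: source not documented']
```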
Given the multitude of vendors, systems, standard operating procedures, lab equipment, and other avenues of data movement, maintaining integrity is no easy task and requires an intentional focus on quality. Some larger sponsors have implemented a Functional Services Provider, or FSP, model and/or engaged vendor oversight practices to standardize processes and systems across vendors and streamline data control.
FSP in action
For example, a Fortune 100 pharmaceutical company approached us looking to free itself from the day-to-day administration of its data operations. The resulting arrangement quickly expanded to include related technical and programming functions, including full operation of the company's clinical data management system, help desk, and global support ticketing system.
We ultimately took on more than 20 unique services, allowing the client to significantly cut its transactional costs while retaining standardized processes and maintaining operational control.
We currently have more than a dozen FSP relationships, and the services take many forms: from FSP agreements to server hosting, and from siting data operations in cost-friendly Central and Eastern Europe to hybrid approaches that combine on-site resources, off-site resources, and global sourcing efficiency.
As we evaluate the components of data management not often thought of as relevant to quality, it's important to assess their impact on the quality of the data we manage. These include EDC systems and other data collection tools; standard operating procedures in place at sites, sponsors, CROs, and other vendors; and outsourcing models and oversight.
Join us September 24 at SCDM for what’s sure to be an enlightening discussion. To schedule a personal meeting to discuss your data quality challenges, contact us today.