Category Archives: Data Validation

Clinical Trials – Computerized Systems

The Food and Drug Administration (FDA) established the Bioresearch Monitoring (BIMO) program of inspections and audits to monitor the conduct and reporting of clinical trials to ensure that data from these clinical trials meet the highest standards of quality and integrity and conform to FDA’s regulations.

Computerized systems used in clinical trials are those that create, modify, maintain, archive, retrieve, or transmit clinical data intended for submission to the Food and Drug Administration (FDA).

Key Definitions:

Audit Trail: a secure, computer-generated, time-stamped electronic record that allows reconstruction of the course of events relating to the creation, modification, and deletion of an electronic record.

Certified Copy: a copy of the original information that has been verified, as indicated by a dated signature, as an exact copy having all of the same attributes and information as the original.

Computerized System: the computer hardware, software, and associated documents (e.g. manuals) that create, modify, maintain, archive, retrieve, or transmit in digital form information related to the conduct of a clinical trial.

Electronic Case Report Form (e-CRF): an electronic record designed to capture the information required by the clinical trial protocol to be reported to the sponsor on each trial subject.

Electronic Patient Diary: an electronic record into which a subject participating in a clinical trial directly enters observations or directly responds to an evaluation checklist or questionnaire.

Electronic Record: a combination of text, graphics, data, audio, pictorial, or any other information representation in digital form that is created, modified, maintained, archived, retrieved or distributed by a computer system.

Electronic Signature: a computer data compilation of any symbol or series of symbols executed, adopted, or authorized by an individual to be the legally binding equivalent of the individual’s handwritten signature.

Software Validation: verification and validation is the process of checking that a software system meets specifications and that it fulfills its intended purpose. For these guidelines, the design level validation is that portion of the software validation that takes place in parts of the software life cycle before the software is delivered to the end-user.

Source Documents: original documents and records including, but not limited to, hospital records, clinical and office charts, laboratory notes, memoranda, subjects’ diaries or evaluation checklists, pharmacy dispensing records, recorded data from automated instruments, copies or transcriptions certified after verification as being accurate and complete, microfiches, photographic negatives, microfilm or magnetic media, x-rays, subject files, and records kept at the pharmacy, at the laboratories, and at medico-technical departments involved in the clinical trial.

Principles:

Security measures should be in place to prevent unauthorized access to the data and to the computerized system.

1-Identify at which steps a computerized system will be used to create, modify, maintain, archive, retrieve, or transmit data.

2-Documentation should identify what software and, if known, what hardware is to be used in computerized systems that create, modify, maintain, archive, retrieve, or transmit data. This document should be retained as part of the study records.

3-Source documents should be retained to enable reconstruction and evaluation of the trial.

4-When original observations are entered directly into a computerized system, the electronic record is the source document.

5-The design of a computerized system should ensure that all applicable regulatory requirements for recordkeeping and record retention in clinical trials are met with the same degree of confidence as is provided with paper systems.

6-Clinical investigators should retain either the original or a certified copy of all source documents sent to a sponsor or contract research organization, including query resolution correspondence.

7-Any change to a record required to be maintained should not obscure the original information. The record should clearly indicate that a change was made and clearly provide a means to locate and read the prior information.

8-Changes to data stored on electronic media will always require an audit trail, in accordance with 21 CFR 11.10(e). The audit trail should include who made the changes, when, and why they were made.

9-The FDA may inspect all records that are intended to support submissions to the Agency, regardless of how they were created or maintained.

10-Data should be retrievable in such a fashion that all information regarding each individual subject in a study is attributable to that subject.

11-Computerized systems should be designed so that all requirements assigned to these systems in a study protocol are satisfied and to preclude errors in data creation, modification, maintenance, archiving, retrieval or transmission.

As we read in this blog, the guidance for industry around computerized systems revolves around data quality and data integrity. People using the data from these systems should have confidence that the data are no less reliable than data in paper form.

In the next blog, we will cover audits and inspections, data entry into these computerized systems, security, and electronic signatures as a way of certifying the data.

Source:

CFR 11 and ICH

fda.gov

Got Medrio? The Next Best EDC…

Medrio is a low-cost solution that offers easy mid-study changes and intuitive Phase I workflows.


One of my favorite features of Medrio is the Skip logic functionality. So what is Skip logic?

Let’s demonstrate this feature by using the Demography form / Race field:

In many EDC systems that I am currently using or have used in the past, we have to create a separate ‘Specify’ field and write a custom edit check to flag when data has been entered in it. This scenario requires data in the ‘Specify’ field only when the OTHER race option is checked. With skip logic, the ‘Specify’ field is not enterable when another option (e.g. White, Black or Indian) is selected; it becomes visible and required (mandatory) for data entry only when the user selects OTHER.
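The skip-logic behaviour described above can be sketched in a few lines. This is a hypothetical illustration only; the function name, field names, and return structure are mine, not Medrio’s actual API:

```python
# Hypothetical sketch of skip logic for the Demography Race field.
# Field names and states are illustrative, not Medrio's API.

def race_specify_state(race: str) -> dict:
    """Return the entry state of the 'Specify' field for a given Race value."""
    if race == "OTHER":
        # Skip logic makes the field visible and required for data entry.
        return {"visible": True, "required": True}
    # Any other race (e.g. White, Black, Indian): the field is skipped.
    return {"visible": False, "required": False}

print(race_specify_state("OTHER"))  # {'visible': True, 'required': True}
print(race_specify_state("White"))  # {'visible': False, 'required': False}
```

The point of the feature is that this conditional behaviour is configured declaratively in the form builder rather than coded as a separate edit check.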

eCRF – DEMO – Medrio

DM form – Skip Logic

The above screenshot shows the query resulting from the skip-logic configuration when ‘OTHER, specify’ is not completed. In other words, when a race other than ‘OTHER’ is checked, the ‘Specify’ field will be skipped (not enterable). To make this work, and as a best practice, you will need to make the ‘OTHER’ field required during data entry.

If you are looking for a study builder or clinical programmer to support your clinical trials and data management department, please use the contact form.

Source: medrio.com

Disclaimer: The EDC Developer blog is “one man’s opinion”. Anything said here is opinion, criticism, information or commentary. If you are making any type of investment or legal decision, it would be wise to consult a professional first.

-FAIR USE-
“Copyright Disclaimer Under Section 107 of the Copyright Act 1976, allowance is made for “fair use” for purposes such as criticism, comment, news reporting, teaching, scholarship, and research. Fair use is a use permitted by copyright statute that might otherwise be infringing. Non-profit, educational or personal use tips the balance in favor of fair use.”

Freelancer / Consultant / EDC Developer / Clinical Programmer

* Setting up a project in EDC (Oracle InForm, Medidata Rave, OpenClinica, OCRDC)
* Creation of electronic case report forms (eCRFs)
* Validation of programs, edit checks
* Write validation test scripts
* Execute validation test scripts
* Write custom functions
* Implement study build best practices
* Knowledge of the process of clinical trials and the CDISC data structure

 

Understanding Audit Trail Requirements in Electronic GxP Systems

Computerized systems are used throughout the life sciences industry to support various regulated activities, which in turn generate many types of electronic records.  These electronic records must be maintained according to regulatory requirements contained within FDA’s 21 CFR Part 11 for US jurisdictions and Eudralex Volume 4 Annex 11 for EU jurisdictions.  Therefore, we must ensure the GxP system which maintains the electronic record(s) is capable of meeting these regulatory requirements.

What to look for in an audit trail?

  • Is the audit trail activated? Is there an SOP?
  • Is there a record of reviews? (Most companies trust the electronic system’s audit trail and generate an electronic/paper version of it without a full review.)
  • How do you prevent or detect deletion or modification of audit trail data? Is staff trained?
  • Can the audit trail data be filtered?

Can you prove data manipulation did not occur?

Persons must still comply with all applicable predicate rule requirements related to documentation of, for example, date (e.g. 58.130(e)), time, or sequencing of events, as well as any requirements for ensuring that changes to records do not obscure previous entries.

Consideration should be given, based on a risk assessment, to building into the system the creation of a record of all GMP-relevant changes and deletions (a system generated “audit trail”).

Audit trail content, and the reason each element is required:

  • Identification of the user making the entry – needed to ensure traceability.  This could be a user’s unique ID; however, there should be a way of correlating this ID to the person.
  • Date and time stamp – a critical element in documenting a sequence of events, and vital to establishing an electronic record’s trustworthiness and reliability.  It can also be an effective deterrent to records falsification.
  • Link to record – needed to ensure traceability.  This could be the record’s unique ID.
  • Original value and new value – needed in order to have a complete history and to be able to reconstruct the sequence of events.
  • Reason for change – only required if stipulated by the regulations pertaining to the audit-trailed record (see below).
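As a minimal sketch of these required elements, the structure of a single audit-trail entry might look like the following. This is an illustration only, not any particular system’s schema; all names are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditTrailEntry:
    user_id: str          # identification of the user making the entry
    record_id: str        # link to the audit-trailed record
    original_value: str   # value before the change (complete history)
    new_value: str        # value after the change
    reason: str = ""      # only required where regulations stipulate it
    # Secure, computer-generated, time-stamped (UTC avoids time-zone ambiguity).
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Example: correcting an adverse-event onset date.
entry = AuditTrailEntry(
    user_id="jdoe",
    record_id="AE-0042",
    original_value="2015-01-01",
    new_value="2015-01-10",
    reason="Data entry error corrected per source document",
)
```

Making the entry immutable (`frozen=True`) mirrors the expectation that the audit trail itself cannot be edited after the fact.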

FDA / regulator findings and complaints during inspection of audit trail data:

  • Audit user IDs are sometimes hard to interpret (e.g. “user123”); use the full name for each user ID, or provide an additional mapping of IDs to individuals.
  • Field IDs or variable names are used instead of SAS labels or field labels; map field names to their respective field text (e.g. display “Reported Term for the Adverse Event” instead of AETERM).
  • Default values should be easily explained or meaningful (see the annotated CRF).
  • Limited access to audit trail files (many systems with different reporting or extraction tools; data is not fully integrated; too many files that cannot be easily combined).
  • No audit trail review process.  Be prepared to update SOPs or current working practices to add review time for audit trails.  It is expected that, at least every 90 days, qualified staff perform a review of the audit trail for their trials.  Proper documentation, filing and signatures should be in place.
  • Avoid using Excel or CSV files.  Auditors are now asking for SAS datasets of the audit trails, and are being trained to generate their own output from pre-defined sets of parameters so they can summarize data and produce graphs.
  • Formatting issues when exporting into Excel, for example: number and date fields are converted to text fields.

Audit Trail Review

What data must be “audit trailed”?

When it comes to determining which data the audit trail must be applied to, the regulatory agencies (e.g. FDA and EMA) recommend following a risk-based approach.

Following a “risk based approach”

In 2003, the FDA issued recommendations for compliance with 21 CFR Part 11 in the “Guidance for Industry – Part 11, Electronic Records; Electronic Signatures — Scope and Application”.  This guidance narrowed the scope of 21 CFR Part 11 and identified portions of the regulations where the agency would apply enforcement discretion, including audit trails. The agency recommends considering the following when deciding whether to apply audit trails:

  • The need to comply with predicate rule requirements
  • A justified and documented risk assessment to determine the potential effect on product quality, product safety, and record integrity

With respect to predicate rule requirements, the agency states, “Persons must still comply with all applicable predicate rule requirements related to documentation of, for example, date (e.g., § 58.130(e)), time, or sequencing of events, as well as any requirements for ensuring that changes to records do not obscure previous entries.”  In the docket concerning the 21 CFR Part 11 Final Rule, the FDA states, “in general, the kinds of operator actions that need to be covered by an audit trail are those important enough to memorialize in the electronic record itself.” These are actions which would typically be recorded in corresponding paper records according to existing recordkeeping requirements.

The European regulatory agency also recommends following a risk-based approach.  The Eudralex Annex 11 regulations state, “consideration should be given, based on a risk assessment, to building into the system the creation of a record of all GMP-relevant changes and deletions (a system generated “audit trail”).”

MHRA Audit

When does the Audit Trail begin?

The question of when to begin capturing audit trail information comes up quite often, as audit trail initiation requirements differ for data and document records.

For data records:

If the data is recorded directly to electronic storage by a person, the audit trail begins the instant the data hits the durable media.  Note that the audit trail does not need to capture every keystroke made before the data is committed to permanent storage.  This can be illustrated with an example involving a system that manages information related to the manufacturing of active pharmaceutical ingredients.  If, during the process, an operator makes an error while typing the lot number of an ingredient, the audit trail does not need to record every press of the backspace key, or the subsequent keystrokes that correct the typing error, prior to pressing the “return” key (where pressing the return key causes the information to be saved to a disk file).  However, any subsequent “saved” corrections made after the data is committed to permanent storage must be part of the audit trail.
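The distinction can be sketched as a toy model (hypothetical, not any real system): keystrokes in the entry buffer leave no trace, while every save after the first commit is audit-trailed.

```python
# Toy model of commit-time audit trailing. Keystrokes in the transient
# buffer are not logged; only values committed to durable storage are,
# and every saved correction after the first commit produces an entry.

class LotNumberField:
    def __init__(self):
        self.buffer = ""        # transient keystrokes, not audit-trailed
        self.committed = None   # value on durable media
        self.audit_trail = []

    def type(self, text):       # typing leaves no audit trace
        self.buffer += text

    def backspace(self):        # neither does correcting before saving
        self.buffer = self.buffer[:-1]

    def save(self, user):       # pressing "return": commit to durable media
        if self.committed is not None:
            self.audit_trail.append(
                {"user": user, "old": self.committed, "new": self.buffer})
        self.committed = self.buffer

lot_field = LotNumberField()
lot_field.type("LOT-123X")      # typo...
lot_field.backspace()           # ...fixed before saving: no audit entry
lot_field.save("operator1")     # first commit: nothing prior to trail
lot_field.type("4")
lot_field.save("operator1")     # saved correction: must be audit-trailed
print(lot_field.audit_trail)    # one entry, old LOT-123 -> new LOT-1234
```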

For document records:

If the document is subject to review and approval, the audit trail begins upon approval and issuance of the document.  A document record undergoing routine modifications must be version-controlled and managed via a controlled change process. However, interim changes performed in a controlled manner, i.e. during drafting or review-comment collection, do not need to be audit trailed.  Once the new version of a document record is issued, it supersedes all previous versions.

Questions from Auditors: Got Answers?

When was data locked? Can you find this information easily on your audit trail files?

When was the database/system released for the trial? Again, how easily can you run a query and find this information?

When did data entry by investigator (site personnel) commence?

When was access given to site staff?

Source:

Part of this article was taken, with permission, from Montrium – Understanding Audit Trail Requirements in Electronic GxP Systems.

Fair Use Notice: Images/logos/graphics on this page contains some copyrighted material whose use has not been authorized by the copyright owners. We believe that this not-for-profit, educational, and/or criticism or commentary use on the Web constitutes a fair use of the copyrighted material (as provided for in section 107 of the US Copyright Law).

How to Avoid Electronic Data Integrity Issues: 7 Techniques for your Next Validation Project

The idea for this article was taken (with permission from the original authors) from Montrium:  how-to-avoid-electronic-data-integrity-issues-7-techniques-for-your-next-validation-project

Regulatory agencies around the globe are causing life science companies to be increasingly concerned with data integrity.  This comes as no surprise, given that guidance documents for data integrity have been published by the MHRA, FDA (draft), and WHO (draft).  In fact, the recent rise in awareness of the topic has been so tremendous that, less than two years after the original publication, the MHRA released a new draft of its guidance whose scope has been broadened from GMP to all GxP data.

Is data integrity an issue of good documentation practices? You can read GCP information about this topic here.

Good Documentation Practices for SAS / EDC Developers

Are you practising GCP?

In computerised systems, failures in data integrity management can arise from poor system controls or a complete lack thereof.  Human error or lack of awareness may also cause data integrity issues.  Deficiencies in data integrity management are critical because they may lead to issues with product quality and/or patient safety, and may ultimately manifest themselves through patient injury or even death.

I was recently at a vendor qualification for a tool that uses a handheld device to read data while the physician or expert manually applies pressure to a patient’s body part (e.g. for pain assessment). I was not impressed. Even though it seems like a nice device with its own software, the entire process was manual and, therefore, of questionable data integrity. The measurements seemed to be all over the place, and you would need the right personnel at the clinical site to perform an accurate reading since, again, it was all manual and dependent on how the device was used.

I also questioned the calibration of this device. The salesperson’s answer? “Well, it is reading 0 and therefore it is calibrated.” Really? You mean to tell me you have no way of proving when you performed calibration? Where is the paper trail proving your device is accurate? You mean to tell me I have to trust your words? Or your device’s screen that reads ‘0’? Well, I have news for you: tell that to the regulators when they audit the trial.

What is Data Integrity?

Data can be defined as any original and true copy of paper or electronic records.  In the broadest sense, data integrity refers to the extent to which data are complete, consistent and accurate.

To have integrity and to meet regulatory expectations, data must at least meet the ALCOA criteria. Data that is ALCOA-plus is even better.

ALCOA

What is a Computerised System?

A computerised system is not only the set of hardware and software, but also includes the people and documentation (including user guides and operating procedures) that are used to accomplish a set of specific functions.  It is a regulatory expectation that computer hardware and software are qualified, while the complete computerised system is validated to demonstrate that it is fit for its intended use.

How can you demonstrate Electronic Data Integrity through Validation?

Here are some techniques to assist you in ensuring the reliability of GxP data generated and maintained in computerised systems.

Specifications

What to do: Outline your expectations for data integrity within a requirements specification.

For example:

  • Define requirements for the data review processes.
  • Define requirements for data retention (retention period and data format).

Why you should do this: Validation is meant to demonstrate a system’s fitness for intended use.  If you define requirements for data integrity, you will be more inclined to verify that both system and procedural controls for data integrity are in place.

What to do: Verify that the system has adequate technical controls to prevent unauthorised changes to the configuration settings.

For example:

  • Define the system configuration parameters within a configuration specification.
  • Verify that the system configuration is “locked” to end-users.  Only authorized administrators should have access to the areas of the system where configuration changes can be made.

Why you should do this: The inspection agencies expect you to be able to reconstruct any of the activities resulting in the generation of a given raw data set.  A static system configuration is key to being able to do this.

 

Verification of Procedural Controls

What to do: Confirm that procedures are in place to oversee the creation of user accounts.

For example:

  • Confirm that user accounts are uniquely tied to specific individuals.
  • Confirm that generic system administrator accounts have been disabled.
  • Confirm that user accounts can be disabled.

Why you should do this: Shared logins or generic user accounts should not be used, since these would render data non-attributable to individuals.  System administrator privileges (allowing activities such as data deletion or system configuration changes) should be assigned to unique named accounts.  Individuals with administrator access should log in under their named accounts so that audit trail entries can be attributed to specific individuals.

What to do: Confirm that procedures are in place to oversee user access management.

For example:

  • Verify that a security matrix is maintained, listing the individuals authorized to access the system and with what privileges.

Why you should do this: A security matrix is a visual tool for reviewing and evaluating whether appropriate permissions are assigned to an individual.  The risk of tampering with data is reduced if users are restricted to the areas of the system that solely allow them to perform their job functions.

What to do: Confirm that procedures are in place to oversee training.

For example:

  • Ensure that only qualified users are granted access to the system.

Why you should do this: People make up the part of the system that is most prone to error (intentional or not).  Untrained or unqualified users may use the system incorrectly, leading to the generation of inaccurate data or even rendering the system inoperable.  Procedures can be implemented to instruct people on the correct usage of the system.  If followed, procedures can minimize data integrity issues caused by human error.  Individuals should also be sensitized to the consequences and potential harm that could arise from data integrity issues resulting from system misuse.  Logical security procedures may outline controls (such as password policies) and codes of conduct (such as prohibition of password sharing) that contribute to maintaining data integrity.

 

Testing of Technical Controls

What to do: Verify calculations performed on GxP data.

For example:

  • Devise a test scenario where input data is manipulated, and double-check that the calculated output is exact.

Why you should do this: When calculations are part of the system’s intended use, they must be verified to ensure that they produce accurate results.

What to do: Verify that the system is capable of generating audit trails for GxP records.

For example:

  • Devise a test scenario where data is created, modified, and deleted.  Verify each action is captured in a computer-generated audit trail.
  • Verify the audit trail includes the identity of the user performing the action on the record.
  • Verify the audit trail includes a time stamp.
  • Verify the system time zone settings and synchronisation.

Why you should do this: With the intent of minimizing the falsification of data, GxP record-keeping practices prevent data from being lost or obscured.  Audit trails capture who, when and why a record was created, modified or deleted.  The record’s chronology allows for reconstruction of the course of events related to the record.  The content of the audit trails ensures that data is always attributable and contemporaneous.  For data and the corresponding audit trails to be contemporaneous, system time settings must be accurate.
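The calculation-verification scenario above can be illustrated with a tiny worked example. The function, test values, and rounding rule are my own assumptions (the BMI formula itself is standard); a real validation script would use the system’s actual calculations and independently derived expected results:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index = weight (kg) / height (m) squared, rounded to 1 dp."""
    return round(weight_kg / height_m ** 2, 1)

# Test scenario: known inputs versus independently calculated expected outputs.
assert bmi(80.0, 1.80) == 24.7
assert bmi(65.0, 1.70) == 22.5
print("calculation checks passed")
```

The same pattern applies to any derived value the system computes: feed in controlled inputs, compare against a result calculated outside the system, and document the evidence.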


Who can delete data?

Systems should be adequately validated and have sufficient controls to prevent unauthorized access or changes to data.

Implement a data integrity lifecycle concept:

  • Activate audit trail and its backup
  • Backup and archiving processes
  • Disaster recovery plan
  • Verification of restoration of raw data
  • Security, user access and role privileges (Admin)

Warning Signs – Red Flags

  • Poor design and configuration of systems
  • Data review limited to printed records – no review of e-source data
  • System administrators can delete data during QC (without proper documentation)
  • Shared identities/passwords
  • Lack of a culture of quality
  • Poor documentation practices
  • Old computerized systems not complying with Part 11 or Annex 11
  • Lack of audit trail and data reviews
  • Lacking QA oversight – a symptom of a weak QMS?

I love being audited
Perform Self Audits

  • Focus on raw data handling and data review/verification
  • Consider external support to avoid bias
  • Verify the expected sequence of activities: dates, times, quantities, identifiers (such as batch, sample or equipment numbers) and signatures
  • Constantly double-check and cross-reference
  • Verify signatures against a master signature list
  • Check the source of materials received
  • Review batch records for inconsistencies
  • Interview staff, not the managers

FDA 483 observations

“…over-writing electronic raw data…..”

“…OOS not investigated as required by SOP….”

“….records are not completed contemporaneously”

“… back-dating….”

“… fabricating data…”

“…. No saving electronic or hard copy data…”

“…results failing specifications are retested until acceptable results are obtained….”

“…no traceability of reported data to source documents…”

Conclusion:

Even though we try to comply with regulations (regulatory expectations from different agencies, e.g. EMA, MHRA, FDA), data integrity issues are not always easy to detect. It is important that staff working in a regulated environment be properly trained, with continuous refresher training provided throughout their careers (awareness training on new regulations and updates to existing ones).

Companies should also implement a self-audit program and develop a strong quality culture by applying lessons learned from audits.

Sources:

You can read more about data integrity findings by searching the following topics:

MHRA GMP Data Integrity Definitions & Guidance for the Industry
MHRA data integrity blogs: organisational behaviour, ALCOA principles
FDA Warning Letters and Import Alerts
EudraGMDP database non-compliance reports
“The Mind-Numbing Way FDA Uncovers Data Integrity Lapses”, Gold Sheet, 30 January 2015
Data Integrity Pitfalls – Expectations and Experiences


Top 3 Posts at (EDC Developer)

First, I would like to thank everyone who has read the articles posted at {EDC} Developer – especially my colleagues and friends from India, where the highest readership and hits have come from.

New to the industry? Want to get in as a clinical data manager or clinical programmer? Looking for a particular topic or an answer to a question? Check the Contact Me section.

Here are the most-searched articles of the past few months:

1- Data Management: Queries in Clinical Trials

2- How to document the testing done on the edit checks?

3- Why use JReview for your Clinical Trials?

Others most read articles:

Role of Project Management and the Project Manager in Clinical Data Management

4 Programming Languages You Should Learn Right Now (eClinical Speaking)

Data Management Plan in Clinical Trials

These are the top search terms used to find {EDC} Developer:

1-types of edit checks in clinical data management

2-Rave programming

3- pharmaceutical terminology list

4-seeking rave training (better source is mdsol.com)

5- edc programmer

6-central design tips and tricks

Thank you for reading!

Data Management Plan – Coding and Reconciliation

All adverse events and previous/concomitant medications should be coded and/or approved prior to and during the trial.

Before adverse event terms can be reported or analyzed, they must be grouped based on their similarities. For example, headache, mild headache and acute headache should all be counted as the same kind of event. This is done by matching (or coding) the reported adverse events against a large codelist of adverse events, also known as a dictionary or thesaurus.
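The matching step can be sketched as a simple dictionary lookup. The tiny codelist below is illustrative only; real coding uses MedDRA, which is licensed, versioned, and far larger, and real auto-coders also handle synonym lists and coding hierarchies:

```python
# Hypothetical auto-coding sketch: match reported (verbatim) terms against
# a coding dictionary. Terms that do not match are routed to a human coder.

DICTIONARY = {
    "headache": "Headache",
    "mild headache": "Headache",
    "acute headache": "Headache",
    "nausea": "Nausea",
}

def autocode(verbatim: str):
    """Return the coded term, or None if manual coding is needed."""
    return DICTIONARY.get(verbatim.strip().lower())

print(autocode("Mild Headache"))   # Headache
print(autocode("splitting head"))  # None -> route to manual coding
```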

Test cases and other documentation associated with the testing of auto-coding should be produced/documented.  This documentation is not part of the plan. It is a product of the design process and should be filed separately in the TMF system.

In the DMP, you should document the variables and the dictionary to be used.

For concomitant medications, the WHO Drug reference dictionary is used.  Document the version used and, if applicable, the final version of WHO Drug (for trials running over 6 months).

For adverse events, the MedDRA dictionary is the coding method of choice.  Document the version used.

Serious Adverse Event (SAE) Reconciliation:

Indicate the SAE reconciliation approach to be used to compare the SAE database (e.g. Argus) to the clinical study database (e.g. EDC):

  • Indicate tools to be used
  • Location of SAE data
  • Planned timing
  • Planned frequency of SAE Reconciliation activities

What to look for during reconciliation:

  • Matched cases with minor differences, such as onset date
  • Cases found in the CDMS but not in the SAE system
  • Cases found in the SAE system but not in the CDM system

Methods for Reconciliation:

For electronic (automatic) reconciliation between systems, there are some challenges you need to address first, such as which type of data is to be reconciled and then which fields to compare. Best practice is to reconcile those events considered serious according to regulatory definitions.

For manual reconciliation, reports such as SAS listings extracted from both systems with study information, subject or investigator and other key data can be used to perform manual review.  A manual comparison of the events can then assure that they are both complete and comparable.
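The comparison logic underlying either approach can be sketched as follows. The keys, field names, and sample values are hypothetical; real reconciliation would compare several serious-event fields, not just onset date:

```python
# Minimal SAE reconciliation sketch: cases keyed by (subject, event id).
safety_db = {("1001", "SAE-1"): {"onset": "2015-03-01"},
             ("1002", "SAE-1"): {"onset": "2015-04-10"}}
edc_db    = {("1001", "SAE-1"): {"onset": "2015-03-02"},
             ("1003", "SAE-1"): {"onset": "2015-05-05"}}

# Cases present in one system but missing from the other.
only_in_safety = sorted(safety_db.keys() - edc_db.keys())
only_in_edc    = sorted(edc_db.keys() - safety_db.keys())

# Matched cases with minor differences (here, onset date).
mismatches = [(key, safety_db[key]["onset"], edc_db[key]["onset"])
              for key in safety_db.keys() & edc_db.keys()
              if safety_db[key]["onset"] != edc_db[key]["onset"]]

print(only_in_safety)  # in the SAE system but not in the CDM system
print(only_in_edc)     # in the CDMS but not in the SAE system
print(mismatches)      # matched cases needing a query or correction
```

Each of the three buckets corresponds to one of the discrepancy types listed above, and each discrepancy would be resolved via a query to the site or a correction in the appropriate system.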

Central Coding

No matter which method you use for reconciliation, document for each type of data (e.g. AE, medical history, concomitant medications) which glossaries and versions were used.

When data from the clinical trial database is entered into a drug safety database for coding, the data between the two systems should be reconciled to verify that the data in both systems are identical. The processes and frequency of reconciliation should be specified.

Source:

DIA – “A Model Data Management Plan Standard Operating Procedure: Results From the DIA Clinical Data Management Community, Committee on Clinical Data Management Plan”


Anayansi Gamboa has an extensive background in clinical data management as well as experience with different EDC systems including Oracle InForm, InForm Architect, Central Designer, CIS, Clintrial, Medidata Rave, Central Coding, Medrio, IBM eCOS, OpenClinica Open Source and Oracle Clinical.

Data Management Plan – Study Specific Documents

Data Management personnel are responsible for creating, collecting, maintaining and/or retaining all essential study documents when contracted by the sponsor (e.g. biotech company, big pharma client).

It is important to keep both electronic and paper (hard-copy) records and to specify the retention requirements for these essential documents:

  • Final version including amendments of the clinical protocol
  • Final version of the CRF/eCRFs
  • Final version of the completion guidelines
  • All final approvals and written authorizations (e.g. emails or notes to file).
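
A simple completeness check over this list can be sketched as follows. The document names are hypothetical labels drawn from the bullets above, not a standard vocabulary:

```python
# Hypothetical checklist of essential study documents, based on the list
# above; the identifiers are illustrative labels only.
ESSENTIAL_DOCUMENTS = [
    "clinical_protocol_final_with_amendments",
    "crf_ecrf_final",
    "completion_guidelines_final",
    "approvals_and_written_authorizations",
]

def missing_documents(collected):
    """Return essential documents not yet collected for the study file."""
    have = set(collected)
    return [d for d in ESSENTIAL_DOCUMENTS if d not in have]
```

Running such a check before study close-out makes it easy to spot documents that still need to be collected or retained.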


Data Management Plan – Database Archive

Indicate how you intend to archive and share your data and why you have chosen that particular option.

The DMP should outline specific information regarding the organization’s procedures for archiving the electronic records.

Good practice for digital preservation requires that an organization address succession planning for digital assets.

Which criteria will you use to decide which data has to be archived? What should be included in the archive?

Consider the type of data (raw, processed) and how easy it would be to reproduce. Also archive audit trails for as long as the records themselves are retained (21 CFR Part 11, Section 11.10).

Does the archive have specific requirements concerning file formats, metadata, etc.?

It is recommended to use open, non-proprietary formats such as PDF/A, ODM-XML, or ASCII files.
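
As a tiny illustration of the ASCII option, tabular data can be written to plain CSV using only the standard library, so the archived file needs no special tools to read later (the column names below are made up for the example):

```python
import csv
import io

# Write a small demographics table to an ASCII/CSV string using the
# Python standard library's csv module; column names are illustrative.
rows = [
    {"patient": "1001", "sex": "F", "dob": "1980-05-14"},
    {"patient": "1002", "sex": "M", "dob": "1975-11-02"},
]
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["patient", "sex", "dob"])
writer.writeheader()
writer.writerows(rows)
archived_text = buf.getvalue()
```

The resulting text is human-readable and independent of any vendor database, which is the point of choosing an open format for archiving.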


Who is responsible for the data after the project ends?

Sponsor, CRO, vendor? Responsibility should be documented in the DMP. Once the database is locked, and within a reasonable time after data submission to a regulatory agency, you will want to archive your database for long-term storage and recovery.

While most data submitted to regulatory agencies are available in SAS formats, there may be times when going back to the original data format is required.

The easiest way to make sure data are available after database lock is to archive them in the same built-in structure as the current system. For example, Medidata Rave studies are built on top of SQL Server, so you should consider archiving old studies in a SQL Server-compatible format, without any transformation or data manipulation (i.e., raw data).

Other formats that can be considered for data archiving are ODM-XML, PDF/A, or ASCII; these are options for long-term storage. The FDA guidance document for 21 CFR Part 11, “Scope and Application” (Section C.5), states: “FDA does not intend to object if you decide to archive required records in electronic format to nonelectronic media…. As long as predicate rule requirements are fully satisfied and the content and meaning of the records are preserved and archived, you can delete the electronic version of the records.”
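
Before deleting any electronic version, you will want evidence that the archived copy is exact. One common, simple safeguard (not mandated by the guidance, but widely used for verifying copies) is a checksum comparison:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the SHA-256 hex digest of a byte string (e.g., file contents)."""
    return hashlib.sha256(data).hexdigest()

# Illustrative file contents; in practice these would be read from the
# original record and from the archived copy.
original = b"subject,visit,value\n1001,1,7.2\n"
archived = b"subject,visit,value\n1001,1,7.2\n"

# The archive is only treated as a faithful copy if the digests match.
is_exact_copy = sha256_of(original) == sha256_of(archived)
```

Recording the digests alongside the archive also gives later auditors a way to re-verify the copy without access to the original system.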

Archival Plan

For archiving data, this plan should list all the components of the original system that will be included in the archive and the formats used for their storage.

The best practices for archiving clinical research data are no different from those for archiving data in any other industry.



Data Management Plan – Protocol Summary

This section usually describes the management plan for the data collected during the project. It is a brief description, or synopsis, of the protocol.

The protocol, in terms of a clinical research study, is the plan, or blueprint, that describes the study’s objectives, methodology, statistical considerations, and the organization of the study. [CDISC.org Oct. 2012]

Protocol Summary – current state of ‘standardization’ of a protocol document

What to look for when reading a protocol?

  • Review of the T&E (Time and Event) schedule, or visit schedule.
  • Assessments, e.g. ECGs, PE (physical exams), MH (medical history), labs, and more.
  • Critical data variables for analysis, e.g. efficacy and safety data.

For example, a simple SAS listing can be used to review key demographic variables for a given patient:

proc print data=work.demog;
  where patient in ("&pid") and page='3';
  var patient SBJINT page dob sex bmi weight height;
  title 'Page 3 - Demog';
run;
