Motto of the Month

Enjoy every moment as if it were your last. The simple things in life are the most important ones; once you no longer see or have them, you realize you had it good.

You don’t know what you’ve got until it’s gone…

FAIR USE
“Copyright Disclaimer Under Section 107 of the Copyright Act 1976, allowance is made for “fair use” for purposes such as criticism, comment, news reporting, teaching, scholarship, and research. Fair use is a use permitted by copyright statute that might otherwise be infringing. Non-profit, educational or personal use tips the balance in favor of fair use.”


What Goes Around…Comes Around

In and Out of Love – Armin van Buuren feat. Sharon den Adel

Fine Without You – Armin van Buuren feat. Jennifer Rene

Tiesto – Love Comes Again

 

I have been with many people, and in the end you realize that ‘being’ can only happen with one person. Christmas, .. summers, breakfast.
I do it my way!

Fair Use Notice: Images/logos/graphics on this page contain some copyrighted material whose use has not been authorized by the copyright owners. We believe that this not-for-profit, educational, and/or criticism or commentary use on the Web constitutes a fair use of the copyrighted material (as provided for in section 107 of the US Copyright Law).

Yvonne Koelemeijer

Understanding Audit Trail Requirements in Electronic GxP Systems

Computerized systems are used throughout the life sciences industry to support various regulated activities, which in turn generate many types of electronic records.  These electronic records must be maintained according to regulatory requirements contained within FDA’s 21 CFR Part 11 for US jurisdictions and Eudralex Volume 4 Annex 11 for EU jurisdictions.  Therefore, we must ensure the GxP system which maintains the electronic record(s) is capable of meeting these regulatory requirements.

What to look for in an audit trail?

  • Is the audit trail activated? Is there an SOP governing it?
  • Is there a record of reviews? (Most companies trust the electronic system’s audit trail and generate an electronic or paper version of it without a full review.)
  • How do you prevent or detect any deletion or modification of audit trail data? Is staff trained?
  • Can the audit trail be filtered?

Can you prove data manipulation did not occur?

Persons must still comply with all applicable predicate rule requirements related to documentation of, for example, date (e.g. 58.130(e)), time, or sequencing of events, as well as any requirements for ensuring that changes to records do not obscure previous entries.

Consideration should be given, based on a risk assessment, to building into the system the creation of a record of all GMP-relevant changes and deletions (a system generated “audit trail”).

Audit trail content and why each element is required:

  • Identification of the user making the entry – Needed to ensure traceability. This could be a user’s unique ID; however, there should be a way of correlating this ID to the person.
  • Date and time stamp – A critical element in documenting a sequence of events and vital to establishing an electronic record’s trustworthiness and reliability. It can also be an effective deterrent to records falsification.
  • Link to record – Needed to ensure traceability. This could be the record’s unique ID.
  • Original value and new value – Needed to have a complete history and to be able to reconstruct the sequence of events.
  • Reason for change – Only required if stipulated by the regulations pertaining to the audit-trailed record (see below).
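
To make the table concrete, here is a minimal sketch of what a single audit-trail entry could look like as a SAS dataset; the dataset and variable names are hypothetical, not taken from any particular system.

data audit_trail;
  length userid $20 username $50 recordid $20 oldvalue newvalue $200 reason $100;
  userid   = 'jdoe01';               /* unique user ID */
  username = 'Jane Doe';             /* correlates the ID to a person */
  ts       = '05MAR2018:14:32:10'dt; /* date and time stamp */
  recordid = 'AE-0001';              /* link to the audited record */
  oldvalue = 'Mild headache';        /* original value */
  newvalue = 'Moderate headache';    /* new value */
  reason   = 'Data entry error';     /* only if required by regulation */
  format ts datetime20.;
  output;
run;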

FDA / regulator findings and complaints during inspection of audit trail data:

  • Audit trail user IDs can be hard to interpret (e.g. “user123”); use full names for each user ID, or provide an additional mapping (see the sketch after this list).
  • Field IDs or variable names are used instead of SAS labels or field labels; map field names to their respective field text (e.g. instead of displaying AETERM, use “Reported Term for the Adverse Event”).
  • Default values should be meaningful or easily explained (see the annotated CRF).
  • Limited access to audit trail files (many systems with different reporting or extraction tools; data is not fully integrated; too many files that cannot be easily integrated).
  • No audit trail review process. Be prepared to update SOPs or current working practices to add review time for audit trails. It is expected that, at least every 90 days, qualified staff perform a review of the audit trail for their trials. Proper documentation, filing and signatures should be in place.
  • Avoid using Excel or CSV files. Auditors are now asking for SAS datasets of the audit trails, and they are being trained to generate their own output based on a pre-defined set of parameters, allowing them to summarize data and produce graphs.
  • Formatting issues when exporting into Excel, for example: number and date fields are changed to text fields.
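
For the first two findings, a simple mapping layer makes the audit trail readable. A minimal sketch using PROC FORMAT, assuming a hypothetical audit dataset named audit with variables userid and fieldname; the mappings shown are invented for illustration.

proc format;
  value $usermap
    'user123' = 'Jane Doe'        /* map each user ID to a full name */
    'user456' = 'John Smith';
  value $fldmap
    'AETERM'  = 'Reported Term for the Adverse Event';
run;

proc print data=audit;
  format userid $usermap. fieldname $fldmap.;
run;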
Audit Trail Review

What data must be “audit trailed”?

When it comes to determining which data the audit trail must cover, the regulatory agencies (i.e. FDA and EMA) recommend following a risk-based approach.

Following a “risk based approach”

In 2003, the FDA issued recommendations for compliance with 21 CFR Part 11 in the “Guidance for Industry – Part 11, Electronic Records; Electronic Signatures — Scope and Application” (Ref. [04]).  This guidance narrowed the scope of 21 CFR Part 11 and identified portions of the regulations where the agency would apply enforcement discretion, including audit trails. The agency recommends considering the following when deciding whether to apply audit trails:

  • The need to comply with predicate rule requirements
  • A justified and documented risk assessment to determine the potential effect on product quality, product safety and record integrity

With respect to predicate rule requirements, the agency states, “Persons must still comply with all applicable predicate rule requirements related to documentation of, for example, date (e.g., § 58.130(e)), time, or sequencing of events, as well as any requirements for ensuring that changes to records do not obscure previous entries.”  In the docket concerning the 21 CFR Part 11 Final Rule, the FDA states, “in general, the kinds of operator actions that need to be covered by an audit trail are those important enough to memorialize in the electronic record itself.” These are actions which would typically be recorded in corresponding paper records according to existing recordkeeping requirements.

The European regulatory agency also recommends following a risk-based approach.  The Eudralex Annex 11 regulations state, “consideration should be given, based on a risk assessment, to building into the system the creation of a record of all GMP-relevant changes and deletions (a system generated “audit trail”).”

MHRA Audit

When does the Audit Trail begin?

The question of when to begin capturing audit trail information comes up quite often, as audit trail initiation requirements differ for data and document records.

For data records:

If the data is recorded directly to electronic storage by a person, the audit trail begins the instant the data hits the durable media.  It should be noted that the audit trail does not need to capture every keystroke made before the data is committed to permanent storage. This can be illustrated with an example involving a system that manages information related to the manufacturing of active pharmaceutical ingredients.  If, during the process, an operator makes an error while typing the lot number of an ingredient, the audit trail does not need to record every time the operator pressed the backspace key or the subsequent keystrokes to correct the typing error prior to pressing the “return” key (where pressing the return key causes the information to be saved to a disk file).  However, any subsequent “saved” corrections made after the data is committed to permanent storage must be part of the audit trail.

For document records:

If the document is subject to review and approval, the audit trail begins upon approval and issuance of the document.  A document record undergoing routine modifications must be version controlled and managed via a controlled change process. However, interim changes performed in a controlled manner, e.g. during drafting or collection of review comments, do not need to be audit trailed.  Once the new version of a document record is issued, it supersedes all previous versions.

Questions from Auditors: Got Answers?

When was data locked? Can you find this information easily on your audit trail files?

When was the database/system released for the trial? Again, how easily can you run a query and find this information?

When did data entry by investigator (site personnel) commence?

When was access given to site staff?
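
If the audit trail is available as a SAS dataset, questions like these reduce to simple queries. A minimal sketch, assuming a hypothetical audit_trail dataset with recordid, action, username and ts variables; the action codes are invented for illustration.

proc sql;
  select recordid, action, username, ts format=datetime20.
  from audit_trail
  where upcase(action) in ('DATA LOCK', 'DB RELEASE', 'ACCESS GRANTED')
  order by ts;                      /* when was data locked, released, accessed? */
quit;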

Source:

Part of this article was taken, with permission, from Montrium – Understanding Audit Trail Requirements in Electronic GXP Systems.


How to Avoid Electronic Data Integrity Issues: 7 Techniques for your Next Validation Project

The idea for this article was taken (with permission from the original authors) from Montrium:  how-to-avoid-electronic-data-integrity-issues-7-techniques-for-your-next-validation-project

Regulatory agencies around the globe are causing life science companies to be increasingly concerned with data integrity.  This comes as no surprise given that guidance documents for data integrity have been published by the MHRA, FDA (draft), and WHO (draft).  In fact, the recent rise in awareness of the topic has been so tremendous that, less than two years after the original publication, the MHRA released a new draft of its guidance whose scope has been broadened from GMP to all GxP data.

Is data integrity an issue of good documentation practices? You can read GCP information about this topic here.

Good Documentation Practices for SAS / EDC Developers

Are you practising GCP?

In computerised systems, failures in data integrity management can arise from poor system controls or a complete lack of them.  Human error or lack of awareness may also cause data integrity issues.  Deficiencies in data integrity management are critical because they may lead to issues with product quality and/or patient safety and, ultimately, may manifest themselves through patient injury or even death.

I recently attended a vendor qualification for a tool that uses a hand-held device to read data while the physician or expert manually applies pressure to a part of someone’s body (e.g. pain related). I was not impressed. Even though it seems like a nice device with its own software, the entire process was manual and therefore of questionable data integrity. The measurements seemed to be all over the place, and you would need the right personnel at the clinical site to perform a more accurate reading since, again, it was all manual and dependent on how the operator used the device.

I also questioned the calibration of this device. The salesperson’s answer? “Well, it is reading 0 and therefore it is calibrated.” Really? You mean to tell me you have no way of proving when you performed calibration? Where is the paper trail proving your device is accurate? You mean to tell me I have to trust your word? Or your device’s screen that reads ‘0’? Well, I have news for you: tell that to the regulators when they audit the trial.

What is Data Integrity?

Data can be defined as any original and true copy of paper or electronic records.  In the broadest sense, data integrity refers to the extent to which data are complete, consistent and accurate.

To have integrity and to meet regulatory expectations, data must at least meet the ALCOA criteria: Attributable, Legible, Contemporaneous, Original and Accurate. Data that is ALCOA-plus is even better.

[Image: ALCOA]

 

What is a Computerised System?

A computerised system is not only the set of hardware and software; it also includes the people and documentation (including user guides and operating procedures) that are used to accomplish a set of specific functions.  It is a regulatory expectation that computer hardware and software are qualified, while the complete computerised system is validated to demonstrate that it is fit for its intended use.

How can you demonstrate Electronic Data Integrity through Validation?

Here are some techniques to assist you in ensuring the reliability of GxP data generated and maintained in computerised systems.

Specifications

What to do: Outline your expectations for data integrity within a requirements specification.

For example:

  • Define requirements for the data review processes.
  • Define requirements for data retention (retention period and data format).

Why you should do this: Validation is meant to demonstrate a system’s fitness for intended use.  If you define requirements for data integrity, you will be more inclined to verify that both system and procedural controls for data integrity are in place.

What to do: Verify that the system has adequate technical controls to prevent unauthorised changes to the configuration settings.

For example:

  • Define the system configuration parameters within a configuration specification.
  • Verify that the system configuration is “locked” to end users.  Only authorized administrators should have access to the areas of the system where configuration changes can be made.

Why you should do this: The inspection agencies expect you to be able to reconstruct any of the activities resulting in the generation of a given raw data set.  A static system configuration is key to being able to do this.

 

Verification of Procedural Controls

What to do: Confirm that procedures are in place to oversee the creation of user accounts.

For example:

  • Confirm that user accounts are uniquely tied to specific individuals.
  • Confirm that generic system administrator accounts have been disabled.
  • Confirm that user accounts can be disabled.

Why you should do this: Shared logins or generic user accounts should not be used, since these would render data non-attributable to individuals.

System administrator privileges (allowing activities such as data deletion or system configuration changes) should be assigned to unique named accounts.  Individuals with administrator access should log in under their named accounts so that audit trail entries can be attributed to the specific individual.

What to do: Confirm that procedures are in place to oversee user access management.

For example:

  • Verify that a security matrix is maintained, listing the individuals authorized to access the system and with what privileges.

Why you should do this: A security matrix is a visual tool for reviewing and evaluating whether appropriate permissions are assigned to an individual. The risk of tampering with data is reduced if users are restricted to areas of the system that solely allow them to perform their job functions.

What to do: Confirm that procedures are in place to oversee training.

For example:

  • Ensure that only qualified users are granted access to the system.

Why you should do this: People make up the part of the system that is most prone to error (intentional or not).  Untrained or unqualified users may use the system incorrectly, leading to the generation of inaccurate data or even rendering the system inoperable.

Procedures can be implemented to instruct people on the correct usage of the system.  If followed, procedures can minimize data integrity issues caused by human error. Individuals should also be sensitized to the consequences and potential harm that could arise from data integrity issues resulting from system misuse.

Logical security procedures may outline controls (such as password policies) and codes of conduct (such as prohibition of password sharing) that contribute to maintaining data integrity.

 

Testing of Technical Controls

What to do: Verify calculations performed on GxP data.

For example:

  • Devise a test scenario where input data is manipulated, and double-check that the calculated output is correct (a test sketch follows this table).

Why you should do this: When calculations are part of the system’s intended use, they must be verified to ensure that they produce accurate results.

What to do: Verify the system is capable of generating audit trails for GxP records.

For example:

  • Devise a test scenario where data is created, modified, and deleted.  Verify each action is captured in a computer-generated audit trail.
  • Verify the audit trail includes the identity of the user performing the action on the record.
  • Verify the audit trail includes a time stamp.
  • Verify the system time zone settings and synchronisation.

Why you should do this: With the intent of minimizing the falsification of data, GxP record-keeping practices prevent data from being lost or obscured.  Audit trails capture who, when and why a record was created, modified or deleted.  The record’s chronology allows for reconstruction of the course of events related to the record.

The content of the audit trails ensures that data is always attributable and contemporaneous.

For data and the corresponding audit trails to be contemporaneous, system time settings must be accurate.
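
For the calculation row above, the test can be as simple as recomputing the result independently and flagging mismatches. A minimal sketch, assuming the system under test exports its calculated BMI in a dataset named sysout with variables weight_kg, height_m and bmi_system (all hypothetical names):

data calc_check;
  set sysout;
  bmi_expected = weight_kg / (height_m**2);   /* independent recalculation */
  if abs(bmi_system - bmi_expected) > 0.01 then flag = 'MISMATCH';
  else flag = 'OK';
run;

proc print data=calc_check;
  where flag = 'MISMATCH';                    /* any row here is a failed verification */
run;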

 

 

 

Who can delete data?

Systems should be adequately validated and have sufficient controls to prevent unauthorized access or changes to data.

Implement a data integrity lifecycle concept:

  • Activate audit trail and its backup
  • Backup and archiving processes
  • Disaster recovery plan
  • Verification of restoration of raw data
  • Security, user access and role privileges (Admin)

Warning Signs – Red Flags

  • Design and configuration of systems are poor
  • Data review limited to printed records – no review
    of e-source data
  • System administrators during QC, can delete data (no proper documentation)
  • Shared Identity/Passwords
  • Lack of culture of quality
  • Poor documentation practices
  • Old computerized systems not complying with part 11 or Annex 11
  • Lack of audit trail and data reviews
  • Is QA oversight lacking? Symptom of weak QMS?
[Image: I love being audited]

Perform Self Audits

  • Focus on raw data handling & data review/verification
  • Consider external support to avoid bias
  • Verify the expected sequence of activities: dates, times, quantities, identifiers (such as batch, sample or equipment numbers) and signatures
  • Constantly double-check and cross-reference
  • Verify signatures against a master signature list
  • Check the source of materials received
  • Review batch records for inconsistencies
  • Interview staff, not the managers

FDA 483 observations

“…over-writing electronic raw data…..”

“…OOS not investigated as required by SOP….”

“….records are not completed contemporaneously”

“… back-dating….”

“… fabricating data…”

“…no saving electronic or hard copy data…”

“…results failing specifications are retested until acceptable results are obtained…”

  • No traceability of reported data to source documents

Conclusion:

Even though we try to comply with regulations (regulatory expectations from different agencies, e.g. EMA, MHRA, FDA, etc.), data integrity issues are not always easy to detect. It is important that staff working in a regulated environment be properly trained and that continuous refresher training be provided throughout their careers (awareness training on new regulations and regulatory updates).

Companies should also integrate a self-audit program and develop a strong quality culture by implementing lessons learned from audits.

Sources:

You can read more about data integrity findings by searching the following topics:

MHRA GMP Data Integrity Definitions & Guidance for the Industry,
MHRA DI blogs: org behaviour, ALCOA principles
FDA Warning Letters and Import Alerts
EUDRA GMDP database noncompliance

“The Mind-Numbing Way FDA Uncovers Data Integrity Lapses”, Gold Sheet, 30 January 2015

Data Integrity Pitfalls – Expectations and Experiences


Top 3 Posts at (EDC Developer)

First, I would like to thank everyone who has read the articles posted at {EDC} Developer, especially my colleagues and friends from India. The highest readership and hits have come from people living in India.

New to the industry? Want to get in as a clinical data manager or clinical programmer? Looking for a particular topic or an answer to a question? Check the Contact Me section.

Here are the most searched articles of the past few months:

1- Data Management: Queries in Clinical Trials

2- How to document the testing done on the edit checks?

3- Why use JReview for your Clinical Trials?

Other most-read articles:

Role of Project Management and the Project Manager in Clinical Data Management

4 Programming Languages You Should Learn Right Now (eClinical Speaking)

Data Management Plan in Clinical Trials

These are the top search terms used to find {EDC} Developer:

1-types of edit checks in clinical data management

2-Rave programming

3- pharmaceutical terminology list

4-seeking rave training (better source is mdsol.com)

5- edc programmer

6-central design tips and tricks

Thank you for reading!

Data Management Plan – Coding and Reconciliation

All adverse events and previous/concomitant medications should be coded and/or approved prior to and during the trial.

Before adverse event terms can be reported or analyzed, they must be grouped based on their similarities. For example, headache, mild headache and acute headache should all be counted as the same kind of event. This is done by matching (or coding) the reported adverse events against a large code list of adverse events, also known as a dictionary or thesaurus.
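
In essence, auto-coding is a lookup of the verbatim term against the dictionary. A minimal sketch of the idea in SAS, assuming hypothetical datasets ae (with subjid and aeterm) and dict (with verbatim_term and preferred_term); real coding tools add synonym lists and a manual review workflow on top of this:

proc sql;
  create table ae_coded as
  select a.subjid, a.aeterm, d.preferred_term
  from ae as a
  left join dict as d
    on upcase(strip(a.aeterm)) = upcase(strip(d.verbatim_term));
quit;

/* terms with no dictionary match fall out for manual coding */
proc print data=ae_coded;
  where missing(preferred_term);
run;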

Test cases and other documentation associated with the testing of auto-coding should be produced/documented.  This documentation is not part of the plan. It is a product of the design process and should be filed separately in the TMF system.

In the DMP, you should document the variables and the dictionary to be used.

For concomitant medications, the WHO Drug reference list is used.  Also document the version used and, if applicable, the final version of WHO Drug (for trials running over 6 months).

For adverse events, the MedDRA dictionary is the coding method of choice. Document the version used.

Serious Adverse Event (SAE) Reconciliation:

Indicate the SAE reconciliation approach to be used to compare the SAE database (e.g. Argus) to the clinical study database (e.g. EDC):

  • Indicate tools to be used
  • Location of SAE data
  • Planned timing
  • Planned frequency of SAE Reconciliation activities

What to look for during reconciliation:

  • Matched cases with minor differences, such as onset date
  • Cases found in the CDMS but not in the SAE system
  • Cases found in the SAE system but not in the CDM system

Methods for Reconciliation:

For electronic/automatic reconciliation between systems, there are some challenges you need to address first, such as which type of data is to be reconciled and which fields to compare. Best practice is to reconcile those events considered serious according to regulatory definitions.

For manual reconciliation, reports such as SAS listings extracted from both systems with study information, subject or investigator and other key data can be used to perform manual review.  A manual comparison of the events can then assure that they are both complete and comparable.
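
As a sketch of this listing-based comparison, assuming SAS extracts safety_sae and cdms_sae keyed by subject and reported term (the dataset and variable names are assumptions):

proc sort data=safety_sae; by subjid aeterm; run;
proc sort data=cdms_sae;   by subjid aeterm; run;

data matched safety_only cdms_only;
  merge safety_sae(in=s) cdms_sae(in=c);
  by subjid aeterm;
  if s and c then output matched;       /* matched - still compare onset dates etc. */
  else if s  then output safety_only;   /* case missing from the CDMS */
  else            output cdms_only;     /* case missing from the SAE system */
run;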

[Image: Central Coding]

No matter which method you use for reconciliation, document for each type of data (e.g. AE, medical history, concomitant medications) which glossaries and versions were used.

When data from the clinical trial database is entered into a drug safety database for coding, the data between the two systems should be reconciled to verify the data in both systems are identical. The processes and frequency of reconciliation should be specified.

Source:

DIA – A Model Data Management Plan Standard Operating Procedure: Results From the DIA Clinical Data Management Community, Committee on Clinical Data Management Plan


Anayansi Gamboa has an extensive background in clinical data management as well as experience with different EDC systems including Oracle InForm, InForm Architect, Central Designer, CIS, Clintrial, Medidata Rave, Central Coding, Medrio, IBM eCOS, OpenClinica Open Source and Oracle Clinical.

Data Management Plan – Study Specific Documents

Data Management personnel are responsible for creating, collecting, maintaining and/or retaining all essential study documents when contracted by the sponsor (e.g. biotech company, big pharma client).

It is important to keep electronic records and paper hard copies, and to specify the retention requirements for these essential documents:

  • Final version including amendments of the clinical protocol
  • Final version of the CRF/eCRFs
  • Final version of the completion guidelines
  • All final approvals and written authorization (e.g. emails or note to files).
[Image: Study-specific documents]



Data Management Plan – Database Archive

Indicate how you intend to archive and share your data and why you have chosen that particular option.

The DMP should outline specific information regarding the organization’s procedures for archiving the electronic records.

Good practice for digital preservation requires that an organization address succession planning for digital assets.

Which criteria will you use to decide which data has to be archived? What should be included in the archive?

Consider the type of data (raw, processed) and how easy it is to reproduce it. Also consider archiving audit trails for as long as the records are retained (21 CFR Part 11, Section 11.10).

Does the archive have specific requirements concerning file formats, metadata, etc.?

It is recommended to use open formats such as PDF/A, ODM-XML or ASCII files.
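
As a sketch of producing archive copies from SAS, writing both a SAS transport (XPT) file and a plain ASCII (CSV) copy; the file paths and dataset name are placeholders:

libname arch xport 'study01_raw.xpt';   /* SAS transport (XPT) archive file */

proc copy in=work out=arch;
  select demog;                         /* dataset(s) to archive */
run;

proc export data=work.demog
  outfile='demog.csv'
  dbms=csv replace;                     /* plain ASCII copy */
run;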


Who is responsible for the data after the project ends?

Sponsor, CRO, vendor? All should be documented in the DMP. Once the database is locked, within a reasonable time and after data submission to a regulatory agency, you will want to archive your database for long-term storage and recovery.

While most data submitted to regulatory agencies are available in SAS formats, there may be times when going back to the original data format may be required.

The easiest way to make sure data is available after database lock is to archive the data in the same built-in structure as the current system. For example, Medidata Rave trials are built on top of SQL Server; hence, you should consider archiving old studies in a SQL Server-compatible format, without any transformation or data manipulation (i.e. raw data).

Other formats that can be considered for data archives are ODM-XML, PDF/A or ASCII. These are some options for long-term storage. The FDA says in the guidance document for 21 CFR Part 11 (Scope and Application, section C.5): “FDA does not intend to object if you decide to archive required records in electronic format to nonelectronic media…. As long as predicate rule requirements are fully satisfied and the content and meaning of the records are preserved and archived, you can delete the electronic version of the records.”

Archival Plan

For archiving data, this plan should list all the components of the original system that will be included in the archive and the formats being used for their storage.

The best practices for archiving clinical data in clinical research are no different from those for archiving data in any other industry.

 

 



Data Management Plan – Protocol Summary

This section usually describes the management plan for the data collected during the project. It is a brief description, or synopsis, of the protocol.

The protocol, in terms of a clinical research study, is the plan, or blueprint, that describes the study’s objectives, methodology, statistical considerations, and the organization of the study. [CDISC.org, Oct. 2012]

[Image: Protocol Summary – current state of ‘standardization’ of a protocol document]

What to look for when reading a protocol?

  • Review the T&E – Time and Events Schedule or Visit Schedule.
  • Assessments, e.g. ECGs, PE (physical exams), MH (medical history), labs and more.
  • Critical data variables for analysis, e.g. efficacy and safety data

 

For example, a quick SAS listing of key demographic variables for one patient:

proc print data=work.demog;
  where patient in ("&pid") and page='3';
  var patient SBJINT page dob sex bmi weight height;
  title 'Page 3 - Demog';
run;



 

Using PROC UNIVARIATE to Validate Clinical Data


When your data isn’t clean, you need to locate the errors and validate them.  We can use SAS Procedures to determine whether or not the data is clean. Today, we will cover the PROC  UNIVARIATE procedure.

  • The first step is to identify the errors in a raw data file. Usually, in our DMP, in the DVP/DVS section, we define what is considered ‘clean’ versus a data error.
    • Study your data
  • Then validate using the PROC UNIVARIATE procedure.
  • Find extreme values

When you validate your data, you are looking for:

  • Missing values
  • Invalid values
  • Out-of-ranges values
  • Duplicate values

Previously, we used PROC FREQ to find missing/unique values. Today, we will use PROC UNIVARIATE, which is useful for finding data outliers: data that fall outside expected values.

proc univariate data=labdata nextrobs=10;
var LBRESULT;
run;

[Image: PROC UNIVARIATE output for the lab data results]

For validating data, you will be most interested in the last two tables of this report. The missing values table shows that the variable LBRESULT has 260 missing values out of 457 observations. The extreme observations table shows the lowest and highest values (possible outliers) in our dataset. The nextrobs=10 option specifies the number of extreme observations to display on the report; to suppress this table, use nextrobs=0.

To hire me for services, you may contact me via Contact Me OR Join me on LinkedIn

 

Using PROC FREQ to Validate Clinical Data


When your data isn’t clean, you need to locate the errors and validate them.  We can use SAS Procedures to determine whether or not the data is clean. Today, we will cover the PROC FREQ procedure.

  • The first step is to identify the errors in a raw data file. Usually, in our DMP, in the DVP/DVS section, we define what is considered ‘clean’ versus a data error.
    • Study your data
  • Then validate using the PROC FREQ procedure.
  • Spot distinct values

When you validate your data, you are looking for:

  • Missing values
  • Invalid values
  • Out-of-ranges values
  • Duplicate values

Previously, we used PROC PRINT to find missing/invalid values. Today, we will use PROC FREQ to view a frequency table of the unique values for a variable. The TABLES statement in a PROC FREQ step specifies which frequency tables to produce.

proc freq data=labdataranges nlevels;
table _all_ / noprint;
run;

So how many unique lab tests do we have in our raw data file? We know that our SAS data set has 12 records. The Levels column of this report shows that labtest has 3 unique values, which means we must have 9 duplicate lab tests in total. For this type of data (lab ranges), though, this is correct; we are using it only as an example, as you can check any type of data this way.

[Image: PROC FREQ NLEVELS output]

[Image: Lab test data ranges]

So remember: to view the distinct values for a variable, use PROC FREQ, which produces frequency tables (one-way/n-way). You can view the frequency, percent, cumulative frequency, and cumulative percentage. With the NLEVELS option, PROC FREQ displays a table that provides the number of distinct values for each variable named in the TABLES statement.

Example: the SEX variable has the correct values F or M as expected; however, it is missing for two observations. A quick way to surface this is shown below.
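
A minimal sketch using the MISSING option, which counts missing values as their own level in the frequency table (the dataset name demog is an assumption):

proc freq data=demog;
  tables sex / missing;   /* missing values appear as a separate row */
run;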

[Image: PROC FREQ output showing missing values]


Using PROC PRINT to Validate Clinical Data


When your data isn’t clean, you need to locate the errors and validate them.  We can use SAS Procedures to determine whether or not the data is clean. Today, we will cover the PROC PRINT procedure.

  • The first step is to identify the errors in a raw data file. Usually, in our DMP, in the DVP/DVS section, we define what is considered ‘clean’ versus a data error.
    • Study your data
  • Then validate using the PROC PRINT procedure.
  • We will clean the data using DATA step assignments and IF-THEN-ELSE statements.

When you validate your data, you are looking for:

  • Missing values
  • Invalid values
  • Out-of-ranges values
  • Duplicate values

In the example below, we find missing values in our lab data ranges table. We also would like to convert the lab test names to upper case.

[Images: clinical raw data, PROC PRINT data validation code and output]

 

From the screenshots above, our PROC PRINT program identified all missing/invalid values per our specifications. We need to clean up 6 observations.
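
Since the program itself appears only in a screenshot, here is a minimal sketch of the kind of PROC PRINT check being described, assuming a dataset labdataranges with variables labtest, unit, low, high and EffectiveEnddate (names inferred from the surrounding text):

proc print data=labdataranges;
  where missing(labtest) or missing(unit)
     or missing(low) or missing(high)
     or missing(EffectiveEnddate);      /* list rows violating the specs */
  title 'Lab ranges: missing or invalid values';
run;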

Cleaning Data Using Assignment Statements and If-Then-Else in SAS

We can use a DATA step to update the datasets/tables/domains when there is invalid or missing data, as per protocol requirements.

In our example, we have a lab data ranges for a study that has started but certain information is missing or invalid.

To convert our lab tests to upper case, we will use an assignment statement. For the rest of the data cleaning, we will use IF statements, as sketched below.
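
A minimal sketch of that cleaning step, using the replacement values mentioned in the results below (k/cumm and 31DEC2025); the dataset and variable names are assumptions, and EffectiveEnddate is assumed to be a numeric SAS date:

data labdataranges_clean;
  set labdataranges;
  labtest = upcase(labtest);                    /* assignment statement */
  if missing(unit) then unit = 'k/cumm';        /* per the specifications */
  if missing(EffectiveEnddate) then EffectiveEnddate = '31DEC2025'd;
  format EffectiveEnddate date9.;
run;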

[Image: PROC PRINT data cleaning code]

[Image: final dataset after data validation and cleaning]

From our final dataset, we can verify that there are no missing values. We converted labtest to upper case and updated unit and EffectiveEnddate to k/cumm and 31DEC2025, respectively.

You cannot use PROC PRINT to detect values that are not unique; we will do that in the next post, ‘Using PROC FREQ to Validate Clinical Data’. To find or remove duplicates, check out my previous post, Finding Duplicate Data.

Or use: proc sort data=<dataset> out=sorted nodupkey equals; by ID; run;


 

PM Hats – Six Thinking Hats in Project Management

Six Thinking Hats

Looking at a Decision From All Points of View

‘Six Thinking Hats’ is an important and powerful technique. It is used to look at decisions from a number of important perspectives. This forces you to move outside your habitual thinking style, and helps you to get a more rounded view of a situation.

This tool was introduced in Edward de Bono’s book ‘Six Thinking Hats‘.

Many successful people think from a very rational, positive viewpoint. This is part of the reason that they are successful. Often, though, they may fail to look at a problem from an emotional, intuitive, creative or negative viewpoint. This can mean that they underestimate resistance to plans, fail to make creative leaps and do not make essential contingency plans.

Similarly, pessimists may be excessively defensive, and more emotional people may fail to look at decisions calmly and rationally.

If you look at a problem with the ‘Six Thinking Hats’ technique, then you will solve it using all approaches. Your decisions and plans will mix ambition, skill in execution, public sensitivity, creativity and good contingency planning.

How to Use the Tool

You can use Six Thinking Hats in meetings or on your own. In meetings it has the benefit of blocking the confrontations that happen when people with different thinking styles discuss the same problem.

Each ‘Thinking Hat’ is a different style of thinking. These are explained below:

  • White Hat: neutral and objective, concerned with facts and figures.
    With this thinking hat, you focus on the data available. Look at the information you have, and see what you can learn from it. Look for gaps in your knowledge, and either try to fill them or take account of them. This is where you analyze past trends and try to extrapolate from historical data.
  • Red Hat: the emotional view.
    ‘Wearing’ the red hat, you look at problems using intuition, gut reaction, and emotion. Also try to think how other people will react emotionally. Try to understand the responses of people who do not fully know your reasoning.
  • Black Hat: careful and cautious, the “devil’s advocate” hat.
    Using black hat thinking, look at all the bad points of the decision. Look at it cautiously and defensively. Try to see why it might not work. This is important because it highlights the weak points in a plan. It allows you to eliminate them, alter them, or prepare contingency plans to counter them. Black Hat thinking helps to make your plans ‘tougher’ and more resilient. It can also help you to spot fatal flaws and risks before you embark on a course of action. Black Hat thinking is one of the real benefits of this technique, as many successful people get so used to thinking positively that often they cannot see problems in advance. This leaves them under-prepared for difficulties.
  • Yellow Hat: sunny and positive.
    The yellow hat helps you to think positively. It is the optimistic viewpoint that helps you to see all the benefits of the decision and the value in it. Yellow Hat thinking helps you to keep going when everything looks gloomy and difficult.
  • Green Hat: associated with fertile growth, creativity, and new ideas.
    The Green Hat stands for creativity. This is where you can develop creative solutions to a problem. It is a freewheeling way of thinking, in which there is little criticism of ideas. A whole range of creativity tools can help you here.
  • Blue Hat: cool, the color of the sky, above everything else – the organizing hat.
    The Blue Hat stands for process control. This is the hat worn by people chairing meetings. When running into difficulties because ideas are running dry, they may direct activity into Green Hat thinking. When contingency plans are needed, they will ask for Black Hat thinking, etc.

Exercise:

Here’s an exercise (inspired by de Bono’s ideas) which will work very well with those who have been required to read Six Thinking Hats prior to getting together to brainstorm. Buy several of those delightful Dr. Seuss hats (at least one of each of the six different colors, more if needed) and keep the hats out of sight until everyone is seated. Review the agenda. Review what de Bono says about what each color represents. Then distribute the Dr. Seuss hats, making certain that someone is wearing a hat of each color. Proceed with the discussion, chaired by a person wearing a Blue or White hat. It is imperative that whoever wears a Black hat, for example, be consistently negative and argumentative, whereas whoever wears a Yellow hat must be consistently positive and supportive. After about 15-20 minutes, have each person change to a different colored hat. Resume discussion.

“Six Thinking Hats” is about improving communication and decision-making in groups.

Summary: de Bono puts thinking into steps: 1. Information; 2. Benefits; 3. Critical thinking; 4. Feelings; 5. Creative thinking; 6. Thinking about the thinking, and creating an action plan for implementation.

How would you incorporate the ‘Six Thinking Hats’ in clinical data management?

Reference:

Six Thinking Hats by Edward de Bono, 1999

http://www.mindtools.com

Fair Use Notice: This blog/article/video contains some copyrighted material whose use has not been authorized by the copyright owners. We believe that this not-for-profit, educational, and/or criticism or commentary use on the Web constitutes a fair use of the copyrighted material (as provided for in section 107 of the US Copyright Law).

If you wish to use this copyrighted material for purposes that go beyond fair use, you must obtain permission from the copyright owner. Fair Use notwithstanding we will immediately comply with any copyright owner who wants their material removed or modified, wants us to link to their website or wants us to add their photo.

Disclaimer: The EDC Developer blog is “one man’s opinion”. Anything that is said on the report is either opinion, criticism, information or commentary. If making any type of investment or legal decision it would be wise to contact or consult a professional before making that decision.

Disclaimer: The content of these columns does not necessarily reflect the opinion of {EDC Developer}.

Data Management Plan in Clinical Trials

 

The preparation of the data management plan (DMP) is a simple, straightforward approach designed to promote and ensure comprehensive project planning.

The data management plan typically contains the following items. They are:

  1. Introduction/Purpose of the document
  2. Scope of application/Definitions
  3. Abbreviations
  4. Who/what/where/when
  5. Project Schedule/Major Project Milestones
  6. Updates of the DMP
  7. Appendix

The objective of these guidelines is to define the general content of the Data Management Plan (DMP) and the procedures for developing and maintaining this document.

The abbreviation section could include all acronyms used within a particular study for further clarification.

e.g. CRF = Case Report Form
TA = Therapeutic Area

The Who/What/Where/When section should describe the objective of the study-specific data management plan for the ABC study. This section provides detailed information about the indications, the number of subjects planned for the study, the countries participating in the clinical trial, monitoring guidelines (full or partial SDV), whether any CROs or third parties are involved in the study (e.g. IVRS, central labs), and which database will be used to collect study information (e.g. Clintrial, Oracle Clinical, Medidata Rave or InForm EDC).

The Appendix provides a place to put supporting information, allowing the body of the DMP to be kept concise and at more summary levels. For example, you could document Database Access of team members, Self-evident correction plan, Data Entry plan if using Double-data entry systems or Paper-Based clinical trials systems.

Remember, this is a living document and must be updated throughout the course of the clinical trial.

If problems arise during the life of a project, our first hunch would be that the project was not properly planned.

Reference: Role of Project Management in Clinical Trials
Your comments and questions are valued and encouraged.


Disclaimer: The legal entity on this blog is registered as Doing Business As (DBA) – Trade Name – Fictitious Name – Assumed Name as “GAMBOA”.

Data Management: Queries in Clinical Trials

When an item or variable has an error or a query raised against it, it is said to have a “discrepancy” or “query”.

All EDC systems have a discrepancy management tool, also referred to as “edit checks” or “validation checks”, programmed in a known programming language (e.g. PL/SQL, C#, SQL, Python, etc.).

So what is a ‘query’? A query is an error generated when a validation check detects a problem with the data. Validation checks run automatically whenever a page is saved (“submitted”) and can identify problems with a single variable, between two or more variables on the same eCRF page, or between variables on different pages. A variable can have multiple validation checks associated with it.
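
EDC systems run these checks natively, but the same logic can be sketched in SAS for externally loaded data; the dataset, variable and range below are hypothetical:

data query_candidates;
  set vitals;
  length querytext $80;
  if not missing(sysbp) and (sysbp < 60 or sysbp > 250) then do;
    querytext = 'Systolic BP outside expected range (60-250); please verify.';
    output;                        /* each flagged row becomes a query candidate */
  end;
run;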

Errors can be resolved in several ways:

  • by correcting the error – entering a new value, for example, or updating the datapoint
  • by marking the variable as correct – some EDC systems require an additional response, or you can raise a further query if you are not satisfied with the response

Dealing with queries
Queries can be issued and/or answered by a number of people involved in the trial. Some common setups are: CDM, CRAs or monitors, and site staff or coordinators.

Types of Queries

  • Auto-Queries or Systems checks
  • Manual Queries
  • Coding Queries
  • SDV related Queries generated during a Monitor visit
  • External Queries – for external loaded data in SAS format

EDC Systems and Discrepancy Output Examples

InForm

Note: All queries are associated to a single data item relevant to that query.

RAVE

Note: Users are only able to see / perform an action on a query based on their role and the permissions via Core Config.

Timaeus

Note: Queries are highlighted by a red outline and a Warning icon.

OpenClinica

Note: Extensive interfaces for data query.

Query Metrics – It is important to measure the performance of your clinical trials. Metrics are the same for all clinical studies, but not all EDC systems are the same. Standardized metrics encourage performance improvement, effectiveness, and efficiency. Some common metrics are listed below, followed by a computation sketch:

  • Outstanding Query
  • Query Answer Time
  • Average Time to Query Resolution
  • Number of closed discrepancies on all ongoing studies
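
As an example of computing one of these metrics, a minimal sketch of average query resolution time per site, assuming a hypothetical query log dataset with siteid, opendate and closedate as SAS date variables:

proc sql;
  select siteid,
         count(*) as n_queries,
         avg(closedate - opendate) as avg_days_to_close
  from querylog
  where not missing(closedate)    /* only resolved queries */
  group by siteid;
quit;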

Data management’s experience with data queries in clinical trials


Trademarks: InForm is a trademark or registered trademark of Oracle Corporation. Rave is a trademark or registered trademark of Medidata. Timaeus is a trademark or registered trademark of Cmed Clinical Research.



Role of Project Management and the Project Manager in Clinical Data Management

 

The Project Manager is responsible for the development, oversight of implementation, and communication of clinical research studies.

So what is a Project?

A project is a work effort with a definite beginning and end, an identifiable end result (deliverable), and usually has limits on resources, costs and/or schedule.

What is Project Management?

The application of knowledge, skills, tools, and techniques to project tasks in order to meet project requirements.

In order to be a successful project manager, you need to understand the “Triple Constraint” and how it affects your project. Let’s look at the WBS for edit checks:

Note: I will refer a project = clinical study

Scope: What is in the contract? How many edit checks, SAS checks and manual checks are required in this study? What is the effort per edit check, SAS check and manual check?

The goal is to convert the idea of data management to that of statistical analysis – an analyzable database.

Time: What are the deliverables and timelines? What resources are needed?

Cost: What are the budget restrictions? Are there any risks associated with any changes?

Project Planning: During the planning of a clinical study, we identify the project scope, develop the project management plan, and identify and schedule the clinical study activities.

Some questions might arise during the project planning phase: how many sites, subjects and pages will be collected? Who will attend team meetings? Which study fields will be coded (e.g. the Adverse Event term)?

Other important activities that the project manager and clinical team members will need to be involved:

Work Breakdown Structure (WBS) – the list of activities that will be performed during the course of a clinical study.

Resourcing – it is important to assign the right person to a particular task based on skills, education and experience.

ICH Guidelines ‘…all personnel involved in clinical trials must be qualified and properly trained to perform their respective tasks…’

Estimating Cost – look at historical data as well as good estimates of effort per unit and unit counts, using your WBS as a reference.

Scheduling and Budgeting – you will be able to build schedules and budgets that transform project constraints into project success after you successfully construct your Work Breakdown Structures (WBS) and network diagrams and estimate task durations.

Project managers use these techniques to establish the project schedule. The project manager can decide which activities can be delayed without affecting the duration of the project. These techniques help improve quality and reduce the risks and costs related to the project.

A recent survey by the Project Management Institute provided 10 challenges affecting project managers. This research intended to identify key factors affecting project team performance:

  1. Changes to Project Scope (Scope Creep)
  2. Resources are Inadequate (Excluding Funding)
  3. Insufficient Time to Complete the Project
  4. Critical Requirements are Unspecified or Missing
  5. Inadequate Project Testing
  6. Critical Project Tasks are Delivered Late
  7. Key Team Members Lack Adequate Authority
  8. The Project Sponsor is Unavailable to Approve Strategic Decisions
  9. Insufficient Project Funding
  10. Key Team Members Lack Critical Skills

Another question to ask is what tools are available to help you get the job done?

  1. Resource allocation (and the software’s ability to easily display staff who were overallocated)
  2. Web-based/SaaS option
  3. Cost/Price of the system (big one!)
  4. Contractual terms we could enter into (i.e. 6 months, 12 months, month to month)
  5. Ability to demo the software and for how long
  6. What sort of customizations could be made to the software after purchase
  7. Types of customers the software has served
  8. Report types
  9. Ability to sync with accounting software and which ones, if so
  10. Timeline generation capabilities and import function with MS Project
  11. Ability to create template projects
  12. Ability to alert on early warning signs (i.e. budget overruns over 10%)

It is suggested that you review each candidate project management tool very, very carefully to determine how it fits your processes.

Your organization’s processes are unique to your organization; no other organization anywhere has quite the same processes. So what may work for one organization may not necessarily work for you. Your organization developed its processes to suit your particular corporate culture, the particular collective character attributes of the employees (their experience, etc.), the type of projects that you execute and the particular types of customers/clients that you have (especially the regular ones).

You now have to make sure that the tools you choose work for you and your particular processes. Do not change your processes again to suit whatever workflow (process) is dictated by the fancy tool that the fancy salesman sold to you; you are likely to find that the tool-dictated workflows do not work that well in your organization, with the result that the employees will give up following processes and/or give up using the tool, throwing everything into chaos again.

Be careful if you are looking at tools that offer to do a number of different functions or can be made to do any function you want it to do. They seldom do the job that you bought it for particularly well. For example, I have worked with a tool that was advertised as a combination issue tracking and defect/bug tracking tool. It was used as a defect tracking tool but it was very poor; it was tremendously difficult to make it prepare useful reports. A hand-written tool set up in a spreadsheet (e.g. Microsoft Excel) or database (e.g. Microsoft Access) would have worked better.

That said, there are tools out there that are specific to one particular function but do offer flexible workflows – they may be modified to match whatever processes your organization already follows.

If your organization has just started to organize the PM processes and PMO that would mean processes & other related areas are not explicitly defined. So there may be a huge risk trying to adopt an integrated and centralized project management system. It is more likely to offer you a very comprehensive, complex but expensive solution wherein your problem is still not defined completely. In such a case you are just not ready with the environment and process maturity that an integrated tool requires prior to implementation.

A more efficient approach should be iterative, incremental and adaptive in nature. That means you should use simple, not-so-expensive tools with limited scope to begin with; they can be tools with basic functionalities for WBS, scheduling, traceability and custom datasheets. These tools should be able to exchange data both ways with more commonly used tools like MS Excel, MS Project, Word, etc. The processes are likely to mature over time, and we will then know the real effectiveness of these basic tools in the context of company requirements. That may be the time to analyze and switch to more integrated solutions.

One important thing to remember: the role of project management in clinical trials is evolving. There is a debate about who should be the ‘project manager’ for a particular clinical study: the CRA, the Clinical Data Manager, or an independent project manager? Let’s review their roles within data management.

Clinical Research Associate (CRA): main function is to monitor clinical trials. He or she may work directly with the sponsor company of a clinical trial, as an independent freelancer or for a Contract Research Organization (CRO). A clinical research associate ensures compliance with the clinical trial protocol, checks clinical site activities, makes on-site visits, reviews Case Report Forms (CRFs) and communicates with clinical research investigators. A clinical research associate is usually required to possess an academic degree in Life Sciences and needs to have a good knowledge of Good Clinical Practice and local regulations. In the United States, the rules are codified in Title 21 of the Code of Federal Regulations. In the European Union, these guidelines are part of EudraLex. In India, he or she requires knowledge of the Schedule Y amendments to the Drugs and Cosmetics Rules, 1945.

Clinical Data Manager (CDM): plays a key role in the setup and conduct of a clinical trial. The data collected during a clinical trial will form the basis of the subsequent safety and efficacy analyses, which in turn drive decision-making on product development in the pharmaceutical industry. The Clinical Data Manager is involved in early discussions about data collection options and then oversees development of the data collection tools based on the clinical trial protocol. Once subject enrollment begins, the Clinical Data Manager ensures that data is collected, validated, complete, and consistent. The Clinical Data Manager also liaises with other data providers (e.g. a central laboratory processing collected blood samples) and ensures that such data is transmitted securely and is consistent with the other data collected in the clinical trial. At the completion of the clinical trial, the Clinical Data Manager ensures that all data expected to be captured has been accounted for and that all data management activities are complete. At this stage the data is declared final (terminology varies, but common descriptions are Database Lock and Database Freeze) and the Clinical Data Manager transfers the data for statistical analysis.
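To make the validation step concrete, here is a minimal sketch of an edit check of the sort a Clinical Data Manager might specify; the field names, range, and sample records are hypothetical and not tied to any particular EDC system:

```python
# Minimal sketch of an edit check: flag out-of-range or missing values
# so that a data query can be raised with the site.
# Field names and limits are hypothetical.

def check_systolic_bp(record: dict) -> list[str]:
    """Return a list of query texts for one subject's vital-signs record."""
    queries = []
    value = record.get("SYSBP")
    if value is None:
        queries.append(f"{record['SUBJID']}: systolic BP is missing")
    elif not 60 <= value <= 250:
        queries.append(f"{record['SUBJID']}: systolic BP {value} out of range 60-250")
    return queries

records = [
    {"SUBJID": "1001", "SYSBP": 128},
    {"SUBJID": "1002", "SYSBP": 310},   # triggers a range query
    {"SUBJID": "1003", "SYSBP": None},  # triggers a missing-value query
]
for rec in records:
    for query in check_systolic_bp(rec):
        print(query)
```

Real EDC systems let the CDM define hundreds of such checks declaratively, but conceptually each one is exactly this: a rule, a record, and a query when the two disagree.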

Clinical Data Management (CDM) Tools and Deliverables (we will review each of them in a separate discussion):

  • Standard Operating Procedures (SOPs)
  • The Data Management Plan (DMP)
  • Case Report Form (CRF) Design
  • Database Design and Build (DDB)
  • Validation Rules, also known as edit checks
  • User Acceptance Testing (UAT)
  • Data Entry (DE)
  • Data Validation (DV)
  • Data Queries (DQ)
  • Central Laboratory Data (CLD)
  • Other External Data
  • Serious Adverse Event (SAE) Reconciliation – see the sketch after this list
  • Patient Reported Outcomes (PRO)
  • Database Finalization and Extraction
  • Metrics and Tracking – see the BioClinica article on Metrics
  • Quality Control (QC) – see the discussion on A QC Plan for A Quality Clinical Database
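Following up on the SAE item above, here is a minimal sketch of what SAE reconciliation can look like in practice: comparing adverse-event records between the clinical database and the safety database and flagging mismatches for follow-up. The subjects, terms, and data structures are hypothetical:

```python
# Minimal sketch of SAE reconciliation: compare the clinical database's
# serious adverse events against the safety database and report mismatches.
# Data and field values are hypothetical.

clinical_db = {("1001", "MYOCARDIAL INFARCTION"), ("1002", "SEPSIS")}
safety_db   = {("1001", "MYOCARDIAL INFARCTION"), ("1003", "STROKE")}

# Events present in one source but not the other require reconciliation.
missing_from_safety   = clinical_db - safety_db
missing_from_clinical = safety_db - clinical_db

for subj, term in sorted(missing_from_safety):
    print(f"Subject {subj}: '{term}' in clinical DB but not in safety DB")
for subj, term in sorted(missing_from_clinical):
    print(f"Subject {subj}: '{term}' in safety DB but not in clinical DB")
```

In practice reconciliation also compares onset dates, severity, and outcome, but the core of the activity is exactly this kind of two-way difference.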

In conclusion, a key component of a successful clinical study is delivering the project rapidly and cost-effectively. Project managers must balance resources, budget and schedule constraints, and ever-increasing sponsor expectations.

Source:

To hire me for services, you may contact me via Contact Me or join me on LinkedIn.
Anayansi Gamboa has an extensive background in clinical data management as well as experience with different EDC systems including Oracle InForm, InForm Architect, Central Designer, CIS, Clintrial, Medidata Rave, Central Coding, OpenClinica Open Source and Oracle Clinical.


Star Spangled Banner – Best Performances

I love to hear new versions of our national anthem… it gives me chills. Not everyone can pull it off, and these are my favorites…

My favorite female artists… unique voices…

O’er the land of the free and the home of the brave!

Fair Use Notice: Images/logos/graphics on this page contain some copyrighted material whose use has not been authorized by the copyright owners. We believe that this not-for-profit, educational, and/or criticism or commentary use on the Web constitutes a fair use of the copyrighted material (as provided for in section 107 of the US Copyright Law).

My Favorite Country Songs – Part I

This list is long… Part II in the not-too-distant future. Enjoy!

Fair Use Notice: Images/logos/graphics on this page contain some copyrighted material whose use has not been authorized by the copyright owners. We believe that this not-for-profit, educational, and/or criticism or commentary use on the Web constitutes a fair use of the copyrighted material (as provided for in section 107 of the US Copyright Law).

Read All About It

Emeli Sandé – Read All About It


You’ve got the words to change a nation
But you’re biting your tongue
You’ve spent a life time stuck in silence
Afraid you’ll say something wrong
If no one ever hears it how we gonna learn your song?
So come on, come on
Come on, come on
You’ve got a heart as loud as lightning
So why let your voice be tamed?
Maybe we’re a little different
There’s no need to be ashamed
You’ve got the light to fight the shadows
So stop hiding it away
Come on, come on
I wanna sing, I wanna shout
I wanna scream ’til the words dry out
So put it in all of the papers,
I’m not afraid
They can read all about it
Read all about it, oh
Oh oh oh
Oh oh oh
Oh oh oh
Oh oh oh
Oh oh oh
Oh oh oh
At night we’re waking up the neighbors
While we sing away the blues
Making sure that we’re remembered, yeah
‘Cause we all matter too
If the truth has been forbidden
Then we’re breaking all the rules
So come on, come on
Come on, come on,
Let’s get the TV and the radio
To play our tune again
It’s ’bout time we got some airplay of our version of events
There’s no need to be afraid
I will sing with you my friend
Come on, come on
I wanna sing, I wanna shout
I wanna scream ’til the words dry out
So put it in all of the papers,
I’m not afraid
They can read all about it
Read all about it, oh
Oh oh oh
Oh oh oh
Oh oh oh
Oh oh oh
Oh oh oh
Oh oh oh
Yeah, we’re all wonderful, wonderful people
So when did we all get so fearful?
Now we’re finally finding our voices
So take a chance, come help me sing this
Yeah, we’re all wonderful, wonderful people
So when did we all get so fearful?
And now we’re finally finding our voices
Just take a chance, come help me sing this
I wanna sing, I wanna shout
I wanna scream ’til the words dry out
So put it in all of the papers,
I’m not afraid
They can read all about it
Read all about it, oh
Oh oh oh
Oh oh oh
Oh oh oh
Oh oh oh
Oh oh oh
Oh oh oh
I wanna sing, I wanna shout
I wanna scream ’til the words dry out
So put it in all of the papers,
I’m not afraid
They can read all about it
Read all about it, oh


System Change


Fair Use Notice: Images/logos/graphics on this page contain some copyrighted material whose use has not been authorized by the copyright owners. We believe that this not-for-profit, educational, and/or criticism or commentary use on the Web constitutes a fair use of the copyrighted material (as provided for in section 107 of the US Copyright Law).

Say Something

A Great Big World, Christina Aguilera



Fair Use Notice: Images/logos/graphics on this page contain some copyrighted material whose use has not been authorized by the copyright owners. We believe that this not-for-profit, educational, and/or criticism or commentary use on the Web constitutes a fair use of the copyrighted material (as provided for in section 107 of the US Copyright Law). Yvonne Koelemeijer