Category Archives: Programming

Freelancer / Consultant / EDC Developer / Clinical Programmer

* Setting up a project in EDC (Oracle InForm, Medidata Rave, OpenClinica, OCRDC)
* Creation of electronic case report forms (eCRFs)
* Validation of programs, edit checks
* Write validation test scripts
* Execute validation test scripts
* Write custom functions
* Implement study build best practices
* Knowledge of the process of clinical trials and the CDISC data structure



How to Avoid Electronic Data Integrity Issues: 7 Techniques for your Next Validation Project

The idea for this article was taken (with permission from the original authors) from Montrium: how-to-avoid-electronic-data-integrity-issues-7-techniques-for-your-next-validation-project

Regulatory agencies around the globe are causing life science companies to be increasingly concerned with data integrity. This comes as no surprise, given that guidance documents for data integrity have been published by the MHRA, the FDA (draft), and the WHO (draft). In fact, the recent rise in awareness of the topic has been so tremendous that, less than two years after the original publication, the MHRA released a new draft of its guidance whose scope has been broadened from GMP to all GxP data.

Is data integrity an issue of good documentation practices? You can read GCP information about this topic here.

Good Documentation Practices for SAS / EDC Developers

Are you practising GCP?

In computerised systems, failures in data integrity management can arise from poor system controls or a complete lack thereof. Human error or lack of awareness may also cause data integrity issues. Deficiencies in data integrity management are critical because they may lead to issues with product quality and/or patient safety and, ultimately, may manifest themselves through patient injury or even death.

I recently attended a vendor qualification for a tool that uses a handheld device to read data while the physician or expert manually applies pressure to a subject’s body (e.g. for pain-related measurements). I was not impressed. Even though it seems like a nice device with its own software, the entire process was manual, and the data integrity therefore questionable. The measurements seemed to be all over the place, and you would need the right personnel at the clinical site to obtain an accurate reading, since everything depended on how the operator used the device.

I also questioned the calibration of this device. The salesperson’s answer? “Well, it is reading 0 and therefore, it is calibrated.” Really? You mean to tell me you have no way of proving when you performed calibration? Where is the paper trail proving your device is accurate? You mean to tell me I have to trust your word? Or your device’s screen that reads ‘0’? Well, I have news for you. Tell that to the regulators when they audit the trial.

What is Data Integrity?

Data can be defined as any original and true copy of paper or electronic records.  In the broadest sense, data integrity refers to the extent to which data are complete, consistent and accurate.

To have integrity and to meet regulatory expectations, data must at least meet the ALCOA criteria: Attributable, Legible, Contemporaneous, Original and Accurate. Data that is ALCOA-plus (adding Complete, Consistent, Enduring and Available) is even better.


What is a Computerised System?

A computerised system is not only the set of hardware and software, but also includes the people and documentation (including user guides and operating procedures) that are used to accomplish a set of specific functions. It is a regulatory expectation that computer hardware and software are qualified, while the complete computerised system is validated to demonstrate that it is fit for its intended use.

How can you demonstrate Electronic Data Integrity through Validation?

Here are some techniques to assist you in ensuring the reliability of GxP data generated and maintained in computerised systems.

Specifications

What to do: Outline your expectations for data integrity within a requirements specification.

For example:

  • Define requirements for the data review processes.
  • Define requirements for data retention (retention period and data format).

Why you should do this: Validation is meant to demonstrate a system’s fitness for intended use. If you define requirements for data integrity, you will be more inclined to verify that both system and procedural controls for data integrity are in place.

What to do: Verify that the system has adequate technical controls to prevent unauthorised changes to the configuration settings.

For example:

  • Define the system configuration parameters within a configuration specification.
  • Verify that the system configuration is “locked” to end-users. Only authorized administrators should have access to the areas of the system where configuration changes can be made.

Why you should do this: The inspection agencies expect you to be able to reconstruct any of the activities resulting in the generation of a given raw data set. A static system configuration is key to being able to do this.


Verification of Procedural Controls

What to do: Confirm that procedures are in place to oversee the creation of user accounts.

For example:

  • Confirm that user accounts are uniquely tied to specific individuals.
  • Confirm that generic system administrator accounts have been disabled.
  • Confirm that user accounts can be disabled.

Why you should do this: Shared logins or generic user accounts should not be used, since these would render data non-attributable to individuals. System administrator privileges (allowing activities such as data deletion or system configuration changes) should be assigned to unique named accounts. Individuals with administrator access should log in under their own named accounts, so that audit trails can be attributed to a specific individual.

What to do: Confirm that procedures are in place to oversee user access management.

For example:

  • Verify that a security matrix is maintained, listing the individuals authorized to access the system and with what privileges (a simplified example follows this section).

Why you should do this: A security matrix is a visual tool for reviewing and evaluating whether appropriate permissions are assigned to an individual. The risk of tampering with data is reduced if users are restricted to the areas of the system that solely allow them to perform their job functions.

What to do: Confirm that procedures are in place to oversee training.

For example:

  • Ensure that only qualified users are granted access to the system.

Why you should do this: People make up the part of the system that is most prone to error (intentional or not). Untrained or unqualified users may use the system incorrectly, leading to the generation of inaccurate data or even rendering the system inoperable. Procedures can be implemented to instruct people on the correct usage of the system. If followed, procedures can minimize data integrity issues caused by human error. Individuals should also be sensitized to the consequences and potential harm that could arise from data integrity issues resulting from system misuse. Logical security procedures may outline controls (such as password policies) and codes of conduct (such as prohibition of password sharing) that contribute to maintaining data integrity.
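As an illustration, a simplified security matrix might look like the one below. The roles and permissions shown are hypothetical; an actual matrix should reflect the system’s real functions and the organization’s own roles.

    Role            Enter data   Review data   Change configuration   Manage accounts
    Investigator    Yes          Yes           No                     No
    Data manager    No           Yes           No                     No
    System admin    No           No            Yes                    Yes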


Testing of Technical Controls

What to do: Verify calculations performed on GxP data (see the sketch after this section).

For example:

  • Devise a test scenario where input data is manipulated, and double-check that the calculated output is exact.

Why you should do this: When calculations are part of the system’s intended use, they must be verified to ensure that they produce accurate results.

What to do: Verify the system is capable of generating audit trails for GxP records.

For example:

  • Devise a test scenario where data is created, modified, and deleted. Verify each action is captured in a computer-generated audit trail.
  • Verify the audit trail includes the identity of the user performing the action on the record.
  • Verify the audit trail includes a time stamp.
  • Verify the system time zone settings and synchronisation.

Why you should do this: With the intent of minimizing the falsification of data, GxP record-keeping practices prevent data from being lost or obscured. Audit trails capture who created, modified or deleted a record, when, and why. The record’s chronology allows for reconstruction of the course of events related to the record. The content of the audit trails ensures that data is always attributable and contemporaneous. For data and the corresponding audit trails to be contemporaneous, system time settings must be accurate.
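To make the calculation check concrete, here is a minimal SAS sketch of independently recomputing a system-derived value and flagging discrepancies. BMI is used as a hypothetical example; the dataset and variable names (DERIVED.VITALS, BMI, WEIGHT_KG, HEIGHT_M) are illustrative, not from any specific system.

    /* Recompute a system-derived value and flag discrepancies */
    data calc_check;
       set derived.vitals;                        /* hypothetical source dataset */
       expected_bmi = weight_kg / (height_m**2);  /* independent recalculation */
       if abs(bmi - expected_bmi) > 0.01 then flag = 'MISMATCH';  /* tolerance for rounding */
    run;

    proc print data=calc_check;
       where flag = 'MISMATCH';  /* any rows printed indicate a calculation issue */
    run;

A correctly calculating system produces no flagged rows, so an empty listing is the expected (passing) result.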


Who can delete data?

Systems must be adequately validated and have sufficient controls to
prevent unauthorized access or changes to data.

Implement a data integrity lifecycle concept:

  • Activate audit trail and its backup
  • Backup and archiving processes
  • Disaster recovery plan
  • Verification of restoration of raw data
  • Security, user access and role privileges (Admin)

Warning Signs – Red Flags

  • Poor design and configuration of systems
  • Data review limited to printed records – no review of e-source data
  • System administrators can delete data during QC (without proper documentation)
  • Shared identities/passwords
  • Lack of a culture of quality
  • Poor documentation practices
  • Old computerized systems not complying with Part 11 or Annex 11
  • Lack of audit trail and data reviews
  • Lacking QA oversight – a symptom of a weak QMS

Perform Self Audits

  • Focus on raw data handling & data review/verification
  • Consider external support to avoid bias
  • Verify the expected sequence of activities: dates,
    times, quantities, identifiers (such as batch,
    sample or equipment numbers) and signatures
  • Constantly double check and cross reference
  • Verify signatures against a master signature list
  • Check source of materials received
  • Review batch record for inconsistencies
  • Interview staff, not the managers

FDA 483 observations

“…over-writing electronic raw data…..”

“…OOS not investigated as required by SOP….”

“….records are not completed contemporaneously”

“… back-dating….”

“… fabricating data…”

“…. No saving electronic or hard copy data…”

“…results failing specifications are retested until
acceptable results are obtained….”

“…no traceability of reported data to source documents…”

Conclusion:

Even though we try to comply with regulations (regulatory expectations from different agencies, e.g. EMA, MHRA, FDA), data integrity issues are not always easy to detect. It is important that staff working in a regulated environment be properly trained, and that continuous refresher training be provided throughout their careers (awareness training on new and updated regulations).

Companies should also implement a self-audit program and develop a strong quality culture by applying lessons learned from audits.

Sources:

You can read more about data integrity findings by searching the following topics:

MHRA GMP Data Integrity Definitions & Guidance for the Industry
MHRA data integrity blogs: organisational behaviour, ALCOA principles
FDA Warning Letters and Import Alerts
EUDRA GMDP database non-compliance reports

“The Mind-Numbing Way FDA Uncovers Data Integrity Lapses”, Gold Sheet, 30 January 2015

Data Integrity Pitfalls – Expectations and Experiences

Fair Use Notice: Images/logos/graphics on this page contain some copyrighted material whose use has not been authorized by the copyright owners. We believe that this not-for-profit, educational, and/or criticism or commentary use on the Web constitutes a fair use of the copyrighted material (as provided for in section 107 of the US Copyright Law)

How to document the testing done on the edit checks?

Since the introduction of Electronic Data Capture (EDC) in clinical trials, where data is entered directly into the electronic system, it is estimated that errors (e.g. transcription errors) have been reduced by 70% [Clinical Data Interchange Standards Consortium – Electronic Source Data Interchange, 2005].

The Data Management Plan (DMP) defines the validation test to be performed to ensure data entered into the clinical database is complete, correct, allowable, valid and consistent.

Within the DMP, we find the Data Validation Plan; some companies call it the ‘DVS’, others the ‘DVP’. The Good Practices for Computerized Systems in Regulated GxP Environments guide defines validation as the formal assessment and reporting of quality and performance measures for all the life-cycle stages of software and system development: implementation, qualification and acceptance, operation, modification, requalification, maintenance, and retirement.

As an EDC Developer or Clinical Programmer, you will be asked to:

  • Develop test scripts and execution logs for User Acceptance Testing (UAT).
  • Coordinate UAT of the eCRF build with clinical operations and data management team members, and validate documents, including but not limited to: edit check documents, issue logs, the UAT summary report, and the preparation and testing of test cases.

Remember, not every EDC system is alike. Some systems allow you to perform testing on the programmed edit checks; others allow you to enter test data in a separate instance from production (PROD). (A sketch of typical edit-check logic follows the numbered list below.)


For example, some EDC systems facilitate re-usability:

  1. There is a built-in test section for each study – where data can be entered and are stored completely separate from production data. This allows you to keep the test data for as long as needed to serve as proof of testing.
  2. The copy function allows for a library of existing checks (together with their associated CRF pages) to be copied into a new study. If there are no changes to the standard checks or pages then reference can be made back to the original set of test data in a standards study, thus reducing the study level overhead.
  3. Many of the required checks (missing data, range checks, partial dates, etc.) do not require the programming of an edit check at all. Each of these, and many others, are already there as part of the question definition itself and therefore do not need any additional testing or documentation for each study.
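Whatever the system, the underlying logic of an edit check can be expressed and tested outside the EDC tool. Below is a minimal SAS sketch of a missing/out-of-range check; EDC edit checks are system-specific, so this only illustrates the logic, with VS and SYSBP as made-up names.

    /* Flag records that would fire a missing or out-of-range edit check */
    data edit_check_log;
       set vs;                        /* hypothetical vital signs dataset */
       length check_msg $60;
       if missing(sysbp) then check_msg = 'Systolic BP is missing';
       else if sysbp < 60 or sysbp > 250 then check_msg = 'Systolic BP out of range (60-250)';
       if not missing(check_msg) then output;   /* keep only flagged records */
    run;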

If you have not documented it, you have not done it. – FDA

The “ideal world” scenario would be to reduce the actual edit check testing by having the system generate a more “human readable” format of the edit checks. That way, once the system itself is validated, testers would not have to test each boundary condition of every edit check. All they would have to do is inspect the “human readable” edit checks against the alerts, and this would also be easy for clients to read and sign off.

You can leverage the EDC system’s audit trail under certain conditions. First of all, the system you are testing with must itself be validated. Some EDC products are only ‘validated’ once a study is built on top of them – they are effectively further developed as part of the study implementation process – and in this situation I doubt you could safely use the audit trail.

Secondly, you need to come up with a mechanism whereby you can assure that each edit check has been specifically tested – traceability.

Finally, you need to secure the test evidence. The test data inside the EDC tool must be retained for as long as the archive requires, as part of the evidence of testing.

The worst methods, in my view, are paper/screenshot based. They take too long and are largely non-reusable. My past experience has been creating test cases in MS Word, performing each step as per the test case, and taking a screenshot where indicated; these were then attached to the final documentation and validation summary. This is obviously a manual and tedious process. Some companies create test cases using HP Quality Center (HPQC) or a similar tool. This is a bit more automated and traceable, yet still prone to errors. It is better than documenting in MS Word or Excel, but it is still a manual process.

Re-usability is what it is all about, but you need to ensure you have methods for assuring that the test evidence produced for the edit checks you are reusing is usable as part of the re-use exercise.

Edit Check Design, Development and Testing is the largest part of any typical EDC implementation. Applying methods to maximize quality and minimize time spent is one of the areas I have spent considerable time on over the last couple of years.

For additional tips on writing effective edit checks, please go here: Effective edit checks for eCRFs.

To hire me for services, you may contact me via Contact Me OR Join me on LinkedIn

Source images: provided courtesy of Google images.

-FAIR USE-
“Copyright Disclaimer Under Section 107 of the Copyright Act 1976, allowance is made for “fair use” for purposes such as criticism, comment, news reporting, teaching, scholarship, and research. Fair use is a use permitted by copyright statute that might otherwise be infringing. Non-profit, educational or personal use tips the balance in favor of fair use.”

Data Entry | Programmer | Graphic Design | SEO Services for Your Company

Are you struggling to work with freelance developers/designers and looking for a reliable yet cost-effective solution? We let you hire contractual virtual employees for any IT-related service, such as software / web / mobile app / graphic design or data entry, on a full-time / part-time / per-project basis.

Current resource availability:

1- Graphic Designer – Exp 2-5 years – ($749-$1499) [availability: 2]

2- HTML Designer – Exp 3 years – $1349 [availability: 2]

3- PHP or .NET Developer – Exp 1-3 years – ($999-$1999) [availability: 4]

4- Mobile App Developer – $1500 [availability: 2]

5- Content Writer – $999 [availability: 3]

6- SEO / Google Pay-Per-Click Expert – $999 [availability: 4]

7- Data Entry (data research, list management, CMS entries) – $799

Also available: more experienced developers/programmers (MS SQL or Oracle, .NET, PL/SQL) and project managers for various technologies! The main features are: • month-to-month contract • scale up/down easily and quickly

Reply to this email with your requirements/queries and I will be happy to share our company profile, or click here!

Case Study 2: Supporting the Sponsor with Database Transfer Solution

Assisting the Sponsor with Database Transfer Solution

The Scenario:

The Sponsor required a large safety and data management team to meet a submission deadline, requiring the transfer of data to a new safety database for clinical trials and post-marketed products.
RA eClinica Solution:

    • RA eClinica was responsible for AE/SAE reporting, safety coding and NDA submission support
    • RA eClinica collaborated with the Sponsor’s safety team to develop a functional safety alliance of more than 10 team members, inclusive of management, safety and data management resources
    • The RA eClinica team is responsible for managing more than 5 compounds

RA eClinica Results:

    • The RA eClinica project team beat the timelines, completing the tasks approximately 30 days ahead of schedule
    • RA eClinica management collaborated with the Sponsor to redefine operational workflows and processes in order to increase efficiencies across several departments (Quality Control, Pharmacovigilance)

RA eClinica is an established consultancy company for all essential aspects of statistics, clinical data management and EDC solutions. Our services are targeted at clients in the pharmaceutical and biotech sectors, health insurers and medical device companies.

The company is headquartered in Panama City, with representative offices and business partners in the United States, India and the European Union. For a discussion about our services and how you can benefit from our SMEs and cost-effective implementation of CDISC SDTM clinical data standards, click here.

The Only Three (3) [Programming] Languages You Should Learn Right Now (eClinical Speaking)

In a previous article, written in 2012, I mentioned four programming languages you should be learning when it comes to the development of clinical trials. Why is this important, you may ask? A clinical trial is a method to determine whether a new drug or treatment works on a disease and whether it is beneficial to patients. If you have never written a line of code in your life, you are in the right place. If you have some programming experience but are interested in learning clinical programming, this information can be helpful.

But shouldn’t I be Learning ________?

Here are the latest eClinical programming languages you should learn:

1. SAS®: Data analysis and results reporting are the two major tasks for SAS® programmers. SAS currently offers a certification as a Clinical Trials Programmer. Some of the skills you should learn are:

  • clinical trials process
  • accessing, managing, and transforming clinical trials data
  • statistical procedures and macro programming
  • reporting clinical trials results
  • validating clinical trial data reporting

2. ODM/XML: The CDISC Operational Data Model (ODM) uses XML to define standard data exchange formats that support the acquisition, exchange and archiving of operational clinical trial data.

3. CDISC Language: Yes. This is not just any code. This is the standard language of clinical trials, and you should be learning it right now. The future is here now. The EDC code as we know it will eventually go away as more and more vendors adapt their systems and technologies to meet rules and regulations. Some of the skills you should learn:

  • Annotation of variables and variable values – SDTM aCRF
  • Define-XML – CDISC SDTM datasets
  • ADaM datasets – CDISC ADaM datasets

CDISC has established data standards to speed up data review, and the FDA is now suggesting that this will soon become the norm. Pharmaceutical and biotechnology companies and many sponsors within clinical research are now better equipped to improve CDISC implementation.

Everyone should learn to code

SAS® and XML therefore now cooperate. The XML engine introduced in SAS® 9 lets you import a wide variety of XML documents. SAS® does what it does best – statistics – and XML does what it does best – creating report-quality tables by taking advantage of the full feature set of the publishing software. This combination can produce report-quality tables in an automated, hands-off/lights-out process.
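As a minimal sketch of the import side, the XML libname engine can read an XML file laid out in SAS’s generic export format directly into a dataset. The file path and the AE table name below are illustrative assumptions.

    /* Read ae.xml (generic SAS XML layout) into a SAS dataset */
    libname aexml xml '/study/ae.xml';

    data work.ae;
       set aexml.ae;                  /* member name must match the table element in the XML */
    run;

    proc print data=work.ae (obs=5);  /* quick sanity check of the import */
    run;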

Standards are more than just CDISC

If you are looking for your next career move in Clinical Data Management, then SAS and CDISC SDTM should set you on the right path to career development and job security.

Conclusion: Learn basic and advanced SAS clinical programming concepts, such as reading and manipulating clinical data. Using the clinical features and basic SAS programming concepts of clinical trials, you will be able to import ADaM, CDISC or other standards for domain structure and contents into the metadata, build clinical domain target table metadata from those standards, create jobs to load clinical domains, validate the structure and content of the clinical domains based on the standards, and generate CDISC-standard define.xml files that describe the domain tables for clinical submissions.

Anayansi Gamboa has an extensive background in clinical data management as well as experience with different EDC systems including Oracle InForm, InForm Architect, Central Designer, CIS, Clintrial, Medidata Rave, Central Coding, OpenClinica – Open Source and Oracle Clinical.

Disclaimer: The legal entity on this blog is registered as Doing Business As (DBA) – Trade Name – Fictitious Name – Assumed Name as “GAMBOA”.

Source:

SAS Institute
CDISC


Comments? Join us at {EDC Developer}

Anayansi Gamboa, MPM, an EDC Developer Consultant and clinical programmer for the Pharmaceutical and Biotech industry with more than 13 years of experience.

Available for short-term contracts or ad-hoc requests. See my specialties section (Oracle, SQL Server, EDC Inform, EDC Rave, OpenClinica, SAS and other CDM tools)

As the 3 C’s of life states: Choices, Chances and Changes – you must make a choice to take a chance or your life will never change. I continually seek to implement means of improving processes to reduce cycle time and decrease work effort.

Subscribe to my blog’s RSS feed and email newsletter to get immediate updates on latest news, articles, and tips. I am available on LinkedIn. Connect with me there for technical discussions.

Fair Use Notice: This article/video contains some copyrighted material whose use has not been authorized by the copyright owners. We believe that this not-for-profit, educational, and/or criticism or commentary use on the Web constitutes a fair use of the copyrighted material (as provided for in section 107 of the US Copyright Law. If you wish to use this copyrighted material for purposes that go beyond fair use, you must obtain permission from the copyright owner. Fair Use notwithstanding we will immediately comply with any copyright owner who wants their material removed or modified, wants us to link to their website or wants us to add their photo.

Project Plan: CDISC Implementation

CDISC standards have been in development for many years, and there are now methodologies and technologies that make transforming non-standard data into CDISC-compliant data much easier. Clinical trials have evolved and become more complex, and this requires a new set of skills outside of clinical research – project management.

As with many projects, CDISC is a huge undertaking. It requires resources, technology and knowledge transfer. The industry (the FDA, for example) has been working on standardization for years, but in September 2013 it became official, when the FDA released a ‘Position Statement‘.

So what is CDISC? We can say that it is a set of conventions: naming conventions for XPT files, naming conventions for field names, and rules for handling unusual data. Currently, there are two main components of CDISC: SDTM (Study Data Tabulation Model) and ADaM (Analysis Data Model).

As a project manager, and with the right tool, you can look to a single source of project information to manage the project through its life-cycle – from planning, through execution, to completion.

1) Define Scope: This covers everything that has to do with getting a project up and running: what’s in the charter, developing the preliminary scope, understanding what your stakeholders need, and how your organization handles projects.

The scope document is a form of a requirement document which will help you identify the goals for this project. It can also be used as a communication method to other managers and team members to set the appropriate level of expectations.

The project scope management plan is a really important tool in your project. You need to make sure that what you’re delivering matches what you wrote down in the scope statement.

2) Define Tasks: We now need to document all the tasks that are required in implementing and transforming your data to CDISC.

Project Tasks (Work Packages)        Estimates (work units)
Initial data standards review        27
Data integrity review                17
Create transformation models         35

The work breakdown structure (WBS) provides the foundation for defining work as it relates to project objectives. It defines the scope of work in terms of deliverables and facilitates communication between the project manager and stakeholders throughout the life of the project. Hence, even though it is preliminary at first, it is a key input to other project management processes and deliverables.

3) Project Plan: Once we have completed the initiation phase (preliminary estimates), we need to create a project plan, assigning resources to the project and scheduling the tasks. Project schedules can be presented in many ways, including simple lists, bar charts with dates, and network logic diagrams with dates, to name just a few. A sample project plan is shown below:

[Sample project plan – image from a Meta-Xceed paper on CDISC]

4) Validation Step: Remember 21 CFR Part 11 compliance for computer systems validation? The risk management effort is not a one-time activity on the project; uncertainty is directly associated with the change being produced by a project. The following lists some of the tasks performed as part of validation.

  • Risk Assessment: Different organizations have different approaches to the validation of programs. This is partly due to varying interpretations of the regulations and partly due to how different managers and organizations function. Assess the level of validation that needs to take place.
  • Test Plan: Confirm that testing is carried out in accordance with the project plan and, if not, determine how to address any deviation. Test planning is essential in ensuring that testing identifies and reveals as many errors as possible, to acceptable levels of quality.


  • Summary Results: This summarizes all the findings documented during testing.

An effective risk management process involves first identifying and defining the risk factors that could affect the various stages of the CDISC implementation process, as well as specific aspects of the project.

5) Transformation Specification: Dataset transformation is a process in which a set of source datasets and their variables are changed to meet new standard requirements. Some changes will occur during this step. For example: a variable name must be no more than 8 characters long; a variable label must not be more than 40 characters in length; values from multiple sources (datasets) may be combined into one variable. A sketch of such a transformation follows.
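As an example, here is a minimal SAS sketch of applying a transformation specification to a hypothetical raw demographics dataset. RAW.DEMOG and the variable names (PATNO, GENDER, BRTHDT) are illustrative assumptions, not from any specific study.

    /* Derive SDTM-style DM variables per a transformation specification */
    data sdtm_dm;
       set raw.demog;                                    /* hypothetical raw dataset */
       length USUBJID $20 SEX $1 BRTHDTC $10;
       USUBJID = catx('-', 'STUDY01', put(patno, z4.));  /* derive unique subject ID */
       SEX     = upcase(substr(gender, 1, 1));           /* map 'Male'/'Female' to M/F */
       BRTHDTC = put(brthdt, is8601da.);                 /* SAS date to ISO 8601 text */
       label USUBJID = 'Unique Subject Identifier';      /* labels stay within 40 characters */
       keep USUBJID SEX BRTHDTC;
    run;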

6) Applying Transformation: This is done according to the specification; however, the specification remains a living document for the duration of a project and can change. There are now many tools available to help with this task, as it can be time-consuming and resource-intensive to update the source code (SAS) manually – Transdata, CDISCXpres and SAS CDIDefine-it, just to name a few.

7) Verification Reports: The validation test plan will detail the specific test cases that need to be implemented to ensure the quality of the transformation. For example, a common report is the “Duplicate Variable” report; a sketch of one way to produce it follows.
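One way to implement such a report is to query the SAS dictionary tables for variable names whose attributes differ across datasets. A minimal sketch, assuming the transformed datasets live in a libref named SDTM:

    /* Flag variable names with inconsistent attributes across datasets */
    proc sql;
       select name,
              count(distinct type)   as n_types,
              count(distinct length) as n_lengths,
              count(distinct label)  as n_labels
       from dictionary.columns
       where libname = 'SDTM'
       group by name
       having calculated n_types > 1
           or calculated n_lengths > 1
           or calculated n_labels > 1;   /* same name, conflicting definitions */
    quit;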

8) Special Purpose Domains: CDISC has several special purpose domains: CO (comments), RELREC (related records, or relationships between two datasets) and SUPPQUAL (supplemental qualifiers for non-standard variables).

9) Data Definition Documentation: In order to understand what all the variables are and how they are derived, we need an annotation document. This is the document that will be included during data submission. SAS PROC CONTENTS can help generate this type of metadata documentation, as sketched below. The last step in the project plan for CDISC implementation is to generate the documentation in either PDF or XML format.
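A minimal sketch of harvesting that metadata with PROC CONTENTS, again assuming the domains are in a libref named SDTM:

    /* Collect dataset/variable metadata for the data definition document */
    proc contents data=sdtm._all_ noprint
                  out=work.metadata (keep=memname name type length label);
    run;

    proc print data=work.metadata;   /* review before formatting as PDF or define.xml */
    run;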

CDISC has established data standards to speed up data review, and the FDA is now suggesting that this will soon become the norm. Pharmaceutical and biotechnology companies and many sponsors within clinical research are now better equipped to improve CDISC implementation.

Need SAS programmers? RA eClinica can help provide resources in-house or off-shore to facilitate FDA review by supporting CDISC mapping, SDTM validation tools, data conversion and CDASH-compliant eCRFs.

Disclaimer: The legal entity on this blog is registered as Doing Business As (DBA) – Trade Name – Fictitious Name – Assumed Name as “GAMBOA”.

Your Strongest Characteristics

The reason I am writing is that I am looking for a new challenge and I felt we might have some areas for discussion, or at least you might be able to point me in the right direction.

My services represent a low-cost alternative to the traditional consultancy companies, which usually demand high prices for their services. I am the right partner for you – whether you are looking for an interim solution due to a personnel shortage in your company, you need creative and unconventional solutions, or your project requires up-to-date knowledge – with direct cost savings.

My current work and skills:

I’m currently working on several initiatives but there is always room for new challenges. I hold a project management degree and several certifications in different fields.

 My style?

I’m a quick-learner, flexible to change, achievement oriented and proactive.

 What you will gain from working with me:

  • Clarity on what you want
  • Tools and approaches for getting it
  • Insight into your patterns that work and roadblocks that don’t
  • An ability to see options and take action – in other words, results.

 What you can expect from me:

Honesty, straightforward yet sensitive advice and confidentiality.

Are these the kinds of skills you look for in a project leader?

Comments? Join us at {EDC Developer}

Anayansi Gamboa, MPM, an EDC Developer Consultant and clinical programmer for the Pharmaceutical and Biotech industry with more than 13 years of experience.

Available for short-term contracts or ad-hoc requests. See my specialties section (Oracle, SQL Server, EDC Inform, EDC Rave, OpenClinica, SAS and other CDM tools)

As the 3 C’s of life states: Choices, Chances and Changes – you must make a choice to take a chance or your life will never change. I continually seek to implement means of improving processes to reduce cycle time and decrease work effort.

Subscribe to my blog’s RSS feed and email newsletter to get immediate updates on latest news, articles, and tips. I am available on LinkedIn. Connect with me there for technical discussions.

Data verification puzzles


An important part of the data management job is to verify received data: checking for inconsistencies and unexpected patterns, and verifying that the data are complete, legible, logical and plausible.

But how do you perform data verification?

You could regard the data verification job as completing a couple of puzzles. Each puzzle is one subject participating in the clinical trial or study at hand. As such, the puzzles resemble each other a great deal, but they are not exact copies. Each subject, each puzzle, is (slightly) different, unique.

Pleasant and thoughtful team action:

Do you have a puzzle somewhere in a cupboard? More than one from the same series? At least 2 puzzles with > 100 pieces each? Open the boxes, drop their content in one pile on the table and start completing the puzzles/subjects. The more pieces in place of a puzzle, the more evident which pieces to expect.

1. Get the parts received, divide them per subject/puzzle and start making all the puzzles. The clinical information on each subject comes in pieces, per completed visit, per available adverse event report. In the beginning you’ll thus work with lots of incomplete puzzles.

2. Any holes in any puzzle/subject, any missing parts, you need to look for/query. Note that holes are allowed if your puzzle/story is as such! However, leave no unexpected holes: if an assessment took place, you want to have the corresponding result(s) completed.

3. Any duplicate pieces, get rid of them. Please query.

4. Any pieces not fitting your puzzle/subject story, you need to check up on. Maybe they belong to another puzzle/subject. Or they are incomplete and therefore cannot fit (yet). They could even have been delivered in error and not belong to the study at all.

5. Any pieces fitting but rotated 90 or 180 degrees, please turn/query. Get the puzzle showing a logical story.

6. Any pieces damaged, please try to fix the damaged parts. E.g. spilled coffee over a paper CRF. Illegible text parts. Or unclear texts that can be interpreted differently.

7. Any pieces added at the wrong place, query and bring to their right position. E.g. an error in an assessment date.

In trial/study language: the more data for a subject received and in the database, the easier it is to get the subject’s story complete. However, more care is needed to get the true story – the logical, plausible subject story. Pay attention to medication given for an adverse event but missing from the concomitant medication list, or laboratory results shifting for the worse without corresponding adverse events listed.

Completing the holes in a puzzle is easy; for data management, the edit checks help you tremendously with that. Getting a logical, plausible story for each patient, reflecting the truth, is the real data management challenge. That takes more than just structuring pieces; it asks you to look at and understand the pictures on the pieces received.

Good luck with your data management puzzles,

Source:

“This is an article of ProCDM. Clinical data management training. Receive tips and the free e-book ‘Five strategies to get reliable, quality clinical data’ by subscribing via http://www.procdm.nl/pages/knowledgebase.asp.”

Comments? Join us at {EDC Developer}

Anayansi Gamboa, MPM, an EDC Developer Consultant and clinical programmer for the Pharmaceutical and Biotech industry with more than 13 years of experience.

Available for short-term contracts or ad-hoc requests. See my specialties section (Oracle, SQL Server, EDC Inform, EDC Rave, OpenClinica, SAS and other CDM tools)

As the 3 C’s of life states: Choices, Chances and Changes – you must make a choice to take a chance or your life will never change. I continually seek to implement means of improving processes to reduce cycle time and decrease work effort.

Subscribe to my blog’s RSS feed and email newsletter to get immediate updates on latest news, articles, and tips. I am available on LinkedIn. Connect with me there for technical discussions.

Fair Use Notice: This article/video contains some copyrighted material whose use has not been authorized by the copyright owners. We believe that this not-for-profit, educational, and/or criticism or commentary use on the Web constitutes a fair use of the copyrighted material (as provided for in section 107 of the US Copyright Law. If you wish to use this copyrighted material for purposes that go beyond fair use, you must obtain permission from the copyright owner. Fair Use notwithstanding we will immediately comply with any copyright owner who wants their material removed or modified, wants us to link to their website or wants us to add their photo.

Disclaimer: The legal entity on this blog is registered as Doing Business As (DBA) – Trade Name – Fictitious Name – Assumed Name as “GAMBOA”.