With Attune’s Business Intelligence Solution for Labs harmonizing your operations data, from billing, cash, remittance, and procurement to performance, you can monitor and forecast financial and operational performance more effectively. Mobile dashboards let your executives stay in touch with recent developments in the revenue cycle at all times.
Hitting the Sweet Spot with Predictive Analytics (Michael Draugelis), by Ashleigh Kades
Speaker Presentation from U.S. News Healthcare of Tomorrow leadership summit, November 2-4, 2016 in Washington, DC. Find out more about this forum at www.usnewshot.com.
Learn about d-Wise's newest offering, Blur! Our De-Identification and Anonymization program to help keep all of your records and user information private.
Jason Bhan, MD
EVP, Chief Medical Officer, Co-founder
Medivo
The Institute’s Unleashing Innovation in Healthcare program is designed to identify and expose innovative technologies and processes for solving many of the complex challenges facing the U.S. healthcare system. This unique 7-minute presentation format gives health IT startups the chance to showcase ground-breaking solutions and approaches that advance the effective use of healthcare technology. Areas of emphasis include solutions and processes that can reduce cost, improve quality, and demonstrate the efficacy of healthcare technology, with a specific focus on Triple Aim drivers.
Big data includes large volumes of data, both unstructured and structured; however, the volume of data is not what matters, the execution is. How organizations perceive that data and implement the resulting understanding, leading to change, is what matters. HashCash Consultants helps organizations analyze their data for insights that result in better decisions and strategic business moves.
These slides give an overview of advanced data quality management (ADQM): why data quality is important, and the steps of DQ management.
Leverage Big Data Analytics to Enhance Clinical Trials from Planning to Execution, by Saama
Nikhil Gopinath, Senior Solutions Engineer for the Life Sciences at Saama, spoke at EyeforPharma's Clinical Trial Innovation Summit event in February 2017. These slides are from his "Leverage Big Data Analytics to Enhance Clinical Trials from Planning to Execution" presentation.
On Tuesday July 26, 2016 I presented at the Tableau 10 launch in front of approximately 1,000 people. The purpose was to explain how we'd been able to leverage Tableau for insights at GenesisCare. This is the presentation slide deck.
In this webinar presentation, d-Wise draws on its deep and unrivaled core expertise enabling life sciences clients to modernize their SAS infrastructure by highlighting key strategies on how to successfully modernize your SAS implementation.
Data Quality Analytics: Understanding what is in your data, before using it, by Domino Data Lab
Analytics and data science are ever-growing fields, as business decision makers continue to use data to drive decisions. The pinnacle of these fields is the models and their accuracy and fit; but what about the data? Is your data clean, and how do you know? This discussion focuses on best practices for data preprocessing for analytic use, from essential distributional checks of a dataset to a proposed method for automated data validation during ETL for transactional data.
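As a rough illustration (not Domino Data Lab's actual method), here is a minimal Python/pandas sketch of such distributional checks and automated batch validation during ETL; the statistics, thresholds, and baseline comparison are illustrative assumptions.

```python
# A minimal sketch of distributional checks on an incoming ETL batch.
import pandas as pd

def distributional_profile(df: pd.DataFrame) -> pd.DataFrame:
    """Summarize each numeric column: completeness plus basic distribution."""
    numeric = df.select_dtypes(include="number")
    return pd.DataFrame({
        "missing_pct": numeric.isna().mean(),
        "mean": numeric.mean(),
        "std": numeric.std(),
        "min": numeric.min(),
        "max": numeric.max(),
    })

def validate_batch(df: pd.DataFrame, baseline: pd.DataFrame,
                   max_missing: float = 0.05, z_limit: float = 3.0) -> list[str]:
    """Compare an incoming batch against a baseline profile and return
    human-readable issues instead of silently loading bad data."""
    issues = []
    profile = distributional_profile(df)
    for col in profile.index:
        if profile.loc[col, "missing_pct"] > max_missing:
            issues.append(f"{col}: {profile.loc[col, 'missing_pct']:.1%} missing")
        # Flag a batch whose mean drifts far from the baseline distribution.
        base_mean, base_std = baseline.loc[col, "mean"], baseline.loc[col, "std"]
        if base_std > 0 and abs(profile.loc[col, "mean"] - base_mean) > z_limit * base_std:
            issues.append(f"{col}: mean drifted from baseline")
    return issues
```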
Optimizing a Data Migration with an Assessment, by Julie Champagne
Performing an assessment of the legacy system(s) before embarking on a data migration project may not always be a part of the project plans, but an assessment is extremely valuable in refining the scope of the project, as well as setting up the project for success.
Please join us to review how taking the time to perform a thorough assessment of your legacy data may benefit your organization, both in the short-term and long-term, in helping to address the different nuances of multiple systems.
Learn how an assessment may be the right choice for sending your next data migration down the path to success!
A brief tour of why we focused on building out a data warehouse early on at Clover, and why we think the Data Science function has room to grow in health insurance.
Big Data Analytics for Healthcare Decision Support: Operational and Clinical, by Adrish Sannyasi
Splunk’s data analytics platform can be used to solve many high-impact business problems in healthcare delivery systems: reducing cost, improving patient outcomes and safety, and enhancing the care coordination experience. Observed behavior from healthcare event data and metadata can be analyzed to discover patterns, monitor compliance, and optimize workflow. Furthermore, 80% of healthcare data is unstructured (clinical free text and documentation) or semi-structured, and many new data sources, such as telehealth, mobile health, sensors, and devices, are being integrated into healthcare systems, particularly for chronic disease management. Organizations therefore need analytics software that can harvest, interpret, enrich, normalize, and model diverse structured and unstructured data, and analytics approaches that embrace the “data turmoil” by relying less on standardized data items and more on the capability to process data in any format.
The Path to Data and Analytics Modernization, by Analytics8
Learn about the business demands driving modernization, the benefits of doing so, and how to get started.
Can your data and analytics solutions handle today’s challenges?
To stay competitive in today’s market, companies must be able to use their data to make better decisions. However, we are living in a world flooded by data, new technologies, and demands from the business for better and more advanced analytics. Most companies do not have the modern technologies and processes in place to keep up with these growing demands. They need to modernize how they collect, analyze, use, and share their data.
In this webinar, we discuss how you can build modern data and analytics solutions that are future-ready, scalable, real-time, high-speed, and agile, enabling better use of data throughout your company.
We cover:
-The business demands and industry shifts that are impacting the need to modernize
-The benefits of data and analytics modernization
-How to approach data and analytics modernization: the steps you need to take and how to get it right
-The pillars of modern data management
-Tips for migrating from legacy analytics tools to modern, next-gen platforms
-Lessons learned from companies that have gone through the modernization process
Big Data Tools PowerPoint Presentation Slides, by SlideTeam
Enhance your audience's knowledge with this well-researched complete deck. Showcase all the important features of the deck with perfect visuals. The deck comprises twenty slides, each explained in detail, and each template comes with professional diagrams and layouts. Our PowerPoint experts have also included icons, graphs, and charts for your convenience. All you have to do is download the deck and make changes as required: these PPT slides are completely customizable. Edit the colour, text, and font size; add or delete content; and leave your audience awestruck with the professionally designed Big Data Tools PowerPoint Presentation Slides complete deck. http://bit.ly/39AwSro
Microsoft: A Waking Giant in Healthcare Analytics and Big Data, by Health Catalyst
In 2005, Northwestern Memorial Healthcare embarked upon a strategic Enterprise Data Warehousing (EDW) initiative with the Microsoft technology platform as the foundation. Dale Sanders was CIO at Northwestern and led the development of Northwestern’s Microsoft-based EDW. At that time, Microsoft as an EDW platform was not in vogue, and many doubted the success of the Northwestern project. While other organizations were spending millions of dollars and years developing EDWs and analytics on other platforms, Northwestern achieved great and rapid value at a fraction of the cost of the more typical technology platforms. Now, there are more healthcare data warehouses built around Microsoft products than around any other vendor's. The risky bet on Microsoft in 2005 paid off.
Ten years ago, critics didn’t believe that Microsoft could scale in the second generation of relational data warehouses, but they did. More recently, many of these same pundits have criticized Microsoft for missing the technology wave du jour in cloud offerings, mobile technology, and big data. But, once again, Microsoft has been quietly reengineering its culture and products, and as a result, they now offer the best value and most visionary platform for cloud services, big data, and analytics in healthcare.
In this context, Dale will talk about:
His up-and-down journey with Microsoft as an Air Force and healthcare CIO, and why he is now more bullish on Microsoft than ever before
A quick review of the Healthcare Analytics Adoption Model and Closed Loop Analytics in healthcare, and how Microsoft products relate to both
The rise of highly specialized, cloud-based analytic services and their value to healthcare organizations’ analytics strategies
Microsoft’s transformation from a closed-system, desktop PC company to an open-system consumer and business infrastructure company
The current transition period of enterprise data warehouses between the decline of relational databases and the rise of non-relational databases, and the new Microsoft products, notably Azure and the Analytic Platform System (APS), that bridge the transition of skills and technology while still integrating with core products like Office, Active Directory, and System Center
Microsoft’s strategy with its PowerX product line, and geospatial analysis and machine learning visualization tools
The Challenges of Analytical Data Management in R&D, by Laura Berry
Presented at the Global Pharma R&D Informatics Congress. To find out more, visit:
www.global-engage.com
Analytical data is at the heart of pharmaceutical research, yet many organisations struggle with the variety of different formats, instrument vendors, and search and retrieval of data. In this presentation, Hans de Bie from ACD/Labs discusses automated capture, exchange formats, integrity, and next generation management systems.
Denodo and RXP Joint Webinar: 'Use of Aggregated Healthcare Data to Enhance a...', by Denodo
Watch here: https://bit.ly/3eiCj0q
More than anything else, the COVID-19 crisis has accelerated the focus on the use of enhanced data in our health system. There are a range of enhanced data sets and analytical tools that can be used to emerge stronger on the other side.
Join this session to learn:
How to aggregate health data from multiple data sets, across data sources, for fast, powerful insights.
How to use data and analytics to monitor the supply chain and make data-driven decisions for maximum efficiency.
How to use analytics tools to monitor patterns in data access, delivering early warnings of misuse and fraud and improving cyber defence.
Implement an Efficient Data Governance and Security Strategy with ..., by Denodo
Watch full webinar here: https://bit.ly/3lSwLyU
In the era of exploding information spread across multiple sources, data governance is a key component for guaranteeing the availability, usability, integrity, and security of information. Likewise, the set of processes, roles, and policies it defines allows organizations to reach their goals while ensuring the efficient use of their data.
Data virtualization is one of the strategic tools for implementing and optimizing data governance. This technology allows companies to create a 360º view of their data and establish security controls and access policies across the entire infrastructure, regardless of format or location. In this way, it brings together multiple data sources, makes them accessible from a single layer, and provides traceability capabilities to monitor changes in the data.
We invite you to join this webinar to learn:
- How to accelerate the integration of data from fragmented data sources across internal and external systems and obtain a comprehensive view of the information.
- How to enable a single, protected data access layer across the entire company.
- How data virtualization provides the pillars for complying with current data protection regulations through data auditing, cataloging, and security.
What Does Data Governance Have in Common with an Amusement Park?, by Denodo
Watch full webinar here: https://bit.ly/3Ab9gYq
Imagine arriving at an amusement park with your family and starting your day without the typical map that lets you plan which shows to see, which rides to go on, and where the children can and cannot ride. You probably would not get the most out of your day and would miss many things. Some people like to go on an adventure and discover things little by little, but when we talk about business, going on an adventure can be fateful...
In the era of exploding information spread across multiple sources, data governance is key to guaranteeing the availability, usability, integrity, and security of that information. Likewise, the set of processes, roles, and policies it defines allows organizations to reach their goals while ensuring the efficient use of their data.
Data virtualization, a strategic tool for implementing and optimizing data governance, allows companies to create a 360º view of their data and establish security controls and access policies across the entire infrastructure, regardless of format or location. In this way, it brings together multiple data sources, makes them accessible from a single layer, and provides traceability capabilities to monitor changes in the data.
In this webinar you will learn how to:
- Accelerate the integration of data from fragmented data sources across internal and external systems and obtain a comprehensive view of the information.
- Enable a single, protected data access layer across the entire company.
- Use data virtualization as the foundation for complying with current data protection regulations through data auditing, cataloging, and security.
Maximize Your Understanding of Operational Realities in Manufacturing with Predictive Insights, by Bigfinite
Maximize Your Understanding of Operational Realities in Manufacturing with Predictive Insights using Big Data, Artificial Intelligence, and Pharma 4.0
by Toni Manzano, PhD, Co-founder and CSO, Bigfinite
PDA Annual Meeting 2020
Keeping the Pulse of Your Data: Why You Need Data Observability, by Precisely
With the explosive growth of DataOps to drive faster and better-informed business decisions, proactively understanding the health of your data is more important than ever. Data observability is one of the foundational capabilities of DataOps: an emerging discipline that exposes anomalies by continuously monitoring and testing data, using artificial intelligence and machine learning to trigger alerts when issues are discovered.
Join Paul Rasmussen and Shalaish Koul from Precisely, to learn how data observability can be used as part of a DataOps strategy to prevent data issues from wreaking havoc on your analytics and ensure that your organization can confidently rely on the data used for advanced analytics and business intelligence.
Topics you will hear addressed in this webinar:
Data observability – what it is and how it differs from other monitoring solutions
Why now is the time to incorporate data observability into your DataOps strategy
How data observability helps prevent data issues from impacting downstream analytics
Examples of how data observability can be used to prevent real-world issues
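As a rough illustration of the kind of check described above, here is a minimal, hypothetical volume-anomaly test in Python; it is not Precisely's product API, just a sketch of the alerting idea, with invented history values.

```python
# A minimal data observability check: flag anomalous daily row counts.
from statistics import mean, stdev

def volume_alert(history: list[int], today: int, z_limit: float = 3.0) -> str | None:
    """Return an alert message if today's row count is anomalous vs. history."""
    if len(history) < 7:          # need some history before judging anomalies
        return None
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return None if today == mu else "row count broke a perfectly constant pattern"
    z = (today - mu) / sigma
    if abs(z) > z_limit:
        return f"row count {today} deviates {z:.1f} sigma from the norm ({mu:.0f})"
    return None

# Example: a table that usually loads ~10,000 rows suddenly loads 1,200.
print(volume_alert([9800, 10150, 9900, 10050, 9750, 10200, 9950], 1200))
```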
Tackling the Challenges of Pharma Manufacturing, by Jason Corder
By nature, pharmaceutical manufacturing operations are complex, inefficient, and consequently costly. Due to the inherent complexities, the cost of poor efficiency and its root causes are often not well understood by many manufacturers.
FDA News Webinar - Inspection Intelligence, by Armin Torres
Developing a digital, data-driven approach to preparing for FDA inspections: using data analytics to proactively monitor internal and external quality and compliance data sources.
Trauma Outpatient Center is a comprehensive facility dedicated to addressing mental health challenges and providing medication-assisted treatment. We offer a diverse range of services aimed at assisting individuals in overcoming addiction, mental health disorders, and related obstacles. Our team consists of seasoned professionals who are both experienced and compassionate, committed to delivering the highest standard of care to our clients. By utilizing evidence-based treatment methods, we strive to help our clients achieve their goals and lead healthier, more fulfilling lives.
Our mission is to provide a safe and supportive environment where our clients can receive the highest quality of care. We are dedicated to assisting our clients in reaching their objectives and improving their overall well-being. We prioritize our clients' needs and individualize treatment plans to ensure they receive tailored care. Our approach is rooted in evidence-based practices proven effective in treating addiction and mental health disorders.
For those battling kidney disease and exploring treatment options, understanding when to consider a kidney transplant is crucial. This guide aims to provide valuable insights into the circumstances under which a kidney transplant at the renowned Hiranandani Hospital may be the most appropriate course of action. By addressing the key indicators and factors involved, we hope to empower patients and their families to make informed decisions about their kidney care journey.
Empowering ACOs: Leveraging Quality Management Tools for MIPS and Beyond, by Health Catalyst
Join us as we delve into the crucial realm of quality reporting for MSSP (Medicare Shared Savings Program) Accountable Care Organizations (ACOs).
In this session, we will explore how a robust quality management solution can empower your organization to meet regulatory requirements and improve processes for MIPS reporting and internal quality programs. Learn how our MeasureAble application enables compliance and fosters continuous improvement.
This document is designed as an introduction for medical students, nursing students, midwives, and other healthcare trainees, to improve their understanding of how the health system in Sri Lanka cares for children's health.
ICH Guidelines for Pharmacovigilance, by Neha Gupta
The "ICH Guidelines for Pharmacovigilance" PDF provides a comprehensive overview of the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH) guidelines related to pharmacovigilance. These guidelines aim to ensure that drugs are safe and effective for patients by monitoring and assessing adverse effects, ensuring proper reporting systems, and improving risk management practices. The document is essential for professionals in the pharmaceutical industry, regulatory authorities, and healthcare providers, offering detailed procedures and standards for pharmacovigilance activities to enhance drug safety and protect public health.
Health Education on Prevention of Hypertension, by Radhika Kulvi
Hypertension is a chronic condition of concern due to its role in the causation of coronary heart disease. Hypertension is a worldwide epidemic and an important risk factor for coronary artery disease, stroke, and renal disease. Blood pressure is the force exerted by the blood against the walls of the blood vessels, sufficient to maintain tissue perfusion during activity and rest. Hypertension is a sustained elevation of BP: in adults, it exists when systolic blood pressure is equal to or greater than 140 mmHg or diastolic BP is equal to or greater than 90 mmHg.
The Importance of Community Nursing Care, by AD Healthcare
NDIS and Community 24/7 Nursing Care is a specific type of support that may be provided under the NDIS for individuals with complex medical needs who require ongoing nursing care in a community setting, such as their home or a supported accommodation facility.
1. 26 Aug 2020
Data Migration in Life Sciences
Krishnaveni Rapuru -
Manager, Engineering
2. ● Krishnaveni Rapuru
○ Engineering Manager at Medidata Solutions
○ 12 years of industry experience across Life Sciences,
Healthcare, Government & Public sector
○ Leadership experience around 6 years
○ Passion : Technology, love <coding> & working with
people, and continuous learning.
About Me
3. Agenda
❏ About Medidata
❏ What is CTMS & Clinical Data?
❏ Why Take Data Migration Seriously?
❏ Medidata Vision & Purpose on Data Migration
❏ Constraints & Risk
❏ Approach
❏ Challenges & Learnings
❏ Best Practices
4. About Medidata
Medidata is leading the digital transformation of life sciences, creating hope for
millions of patients.
Medidata helps generate the evidence and insights to help pharmaceutical, biotech,
medical device and diagnostics companies, and academic researchers accelerate
value, minimize risk, and optimize outcomes. More than one million registered users
across 1,400 customers and partners access the world's most-used platform for
clinical development, commercial, and real-world data.
Medidata, a Dassault Systèmes company (Euronext Paris: #13065, DSY.PA), is
headquartered in New York City and has offices around the world to meet the needs
of its customers.
7. Brief on CTMS & Clinical Data
In simple terms, a Clinical Trial Management System (CTMS) is a software system used by
the biotechnology and pharmaceutical industries to manage clinical trials in clinical research.
8. Why Take Data Migration Seriously?
Good System + Bad Data = Disaster
• Adoption Problems
• Customer Relationship issues
• Decrease in Revenue Generation
• Analytics problems
9. Vision & Purpose
● Provide a seamless data migration solution for all new & existing
customers who want to adopt Medidata's new-generation platform,
in industry-standard formats.
● Migrate all existing customers from legacy products to the new
platform, to reduce operational costs and improve efficiency for
users with a single business process.
● Single view of data, streamlined process, intelligent monitoring &
reporting
10. Constraints and Risk
● Loss of Data: Data could be lost in the process of cleaning and/or migrating
from source to destination.
● Time Lag: There could be a gap between when the data becomes unavailable in the
legacy system and when it is available in the new system
● Time Overlap: Data could be available in two systems before the source or
legacy system is decommissioned
● Loss of Functionality: New CTMS might not have the same functionality as
all combined legacy systems and tools
13. ● Not contacting & communicating with key stakeholders early in the process
● Lack of Data governance
● Lack of planning
● Lack of Subject Matter Experts & Skill sets
● Waiting for “perfect” specs between source and target systems, instead of
starting development with 60%-70% “knowns”
● Lack of Strategy & Execution around testing the migration. Make a choice to
start with Automation & Manual (both).
● Not spending time on dry run with “real” sample source data.
15. ● Understand the data
■ Source data to destination data
■ Assessment of Data Meaning & Quality
● Define Business Process & Project Governance
■ Clear business process with owners
■ Clear roles & responsibilities matrix
● Rollback & Dry Run
■ Define Rollback strategy to mitigate failures
■ Dry run to measure
● The Importance of the Data Mapping Specification Document
■ Source of truth
● Perform comprehensive validation testing
■ Source to Destination data integrity check (Automated & Manual)
■ Risk-based Sampling Strategy
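As a rough illustration of the last bullet, here is a minimal Python sketch of an automated source-to-destination integrity check with risk-based random sampling; the primary-key convention and hashing scheme are illustrative assumptions, not Medidata's implementation.

```python
# A minimal source-to-destination reconciliation check with sampling.
import hashlib
import random

def row_fingerprint(row: tuple) -> str:
    """Stable hash of a row, so source and destination rows compare cheaply."""
    return hashlib.sha256("|".join(map(str, row)).encode()).hexdigest()

def reconcile(source_rows: list[tuple], dest_rows: list[tuple],
              sample_rate: float = 0.1, seed: int = 42) -> dict:
    """Full row-count comparison plus a deep comparison of a random sample.
    Assumes the first column of each row is the primary key."""
    result = {"source_count": len(source_rows), "dest_count": len(dest_rows),
              "missing": [], "mismatched": []}
    if not source_rows:
        return result
    dest_by_key = {r[0]: row_fingerprint(r) for r in dest_rows}
    sample_size = max(1, int(len(source_rows) * sample_rate))
    for row in random.Random(seed).sample(source_rows, sample_size):
        key = row[0]
        if key not in dest_by_key:
            result["missing"].append(key)          # row never arrived
        elif dest_by_key[key] != row_fingerprint(row):
            result["mismatched"].append(key)       # row arrived but was altered
    return result
```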
16. Key Takeaways
● CTMS & Clinical Data
● Need for Data Migration Strategy
● Vision & Purpose for CTMS Data Migration
● Constraints & Risk
● Approach
● Challenge & Learnings
● Best Practices
Alright everyone, welcome and thank you so much for joining us today.
I am very excited to share some insights about data migration within the life sciences industry and how Medidata does clinical data migration. I promise you will all get some valuable takeaways from this talk.
First, let me introduce myself.
I am Krishna, and I have been working as an engineering manager at Medidata Solutions for the past 4 years. I started my career as an engineer 12 years ago and have worked through different roles across industries such as healthcare, government, and the public sector. I have been leading and managing teams for the past 6 years.
My passion has always been technology and working with people. I love exploring and learning new things.
On the personal front, I am a mother of 2 crazy kids and really enjoy spending time with them.
Let's get started! I'll begin with a brief intro about Medidata, then explain what CTMS and clinical data mean. After that I will do a deep dive into clinical data migration and conclude with best practices, along with Q&A.
Now let me talk about Medidata for a moment
Medidata is a growing company and part of Dassault Systèmes. We are a globally distributed company, headquartered in New York with around 2,500 employees, and Europe is the fastest-growing market within Medidata.
We are really having an impact by working together with clients, customers, and partners to provide a better quality of life to so many human beings. Helping our customers provide new drugs that cure cancer and other serious diseases really has a positive impact on society, and that motivates me tremendously day by day.
Now a bit more on Medidata...
13 of the top 15 drugs sold in 2017 were developed using Medidata technology.
Every time I read news like this and see that one of our clients has launched a new drug, I'm obviously very proud and also very motivated, because all of the drugs developed by our partners and clients were touched by Medidata's platform.
Here you can see some of the big, major brands that are part of the Medidata client base.
Before going into the details of migration, let me give you a short brief on what CTMS is and what classifies as clinical data, which is a key pre-step for this session.
So what is CTMS? CTMS is a market-standard solution provided to CROs (clinical research organisations) and sponsors, enabling them to conduct intelligent trials and roll out drugs to the market.
Now, going through the diagram from left to right:
On the left-hand side, we can see that there are 2 segments of data stream coming into Medidata Clinical Cloud. Medidata Clinical Cloud is a cloud-native application and data ecosystem.
The first segment is all the data that comes from electronic device integration, otherwise called EDC, which is Electronic Data Capture, used in trial sites, hospitals, and other pharmaceutical organizations. The second segment of data, which we call “other resources”, comes from lab equipment, sensors, x-ray images, and electronic agreements.
The second key part of the Medidata ecosystem is CTMS, which you can see here. CTMS consumes and publishes data in and out of Medidata Clinical Cloud. It is primarily used by CROs/sponsors and other specific users within the clinical industry to track and manage:
What study is to be conducted, where, and by whom? This is known as Core Study.
How many patients (which we call subjects) visited a site/location? A.k.a. Subject Visits.
How many of them are enrolled on the trial, and how do we track and measure this? This is known as Enrollment.
What issues are faced as part of the trial process, and who resolves them, and how? This is called Issues & Actions.
What are the status, timelines, and progress of the trial? Known as Milestones.
Lastly, there are regulatory and country compliance requirements and standards which have to be met as part of this entire process. A toy data model for these entities is sketched below.
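To make those entities concrete, here is a toy Python data model, purely illustrative and not Medidata's actual schema; all field names are invented.

```python
# A toy sketch of the CTMS entities just listed and how they relate.
from dataclasses import dataclass, field

@dataclass
class Site:
    site_id: str
    country: str             # drives country-specific compliance requirements

@dataclass
class Study:                 # "Core Study": what is conducted, where, by whom
    study_id: str
    sponsor: str
    sites: list[Site] = field(default_factory=list)

@dataclass
class Subject:               # a patient on the trial
    subject_id: str
    site_id: str
    enrolled: bool = False   # "Enrollment": tracked and measured per study

@dataclass
class Visit:                 # "Subject Visits": who visited which site, when
    subject_id: str
    site_id: str
    date: str

@dataclass
class Milestone:             # status, timelines, and progress of the trial
    study_id: str
    name: str
    due_date: str
    status: str = "planned"
```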
By the way, for those who are interested, all this information is available on our website.
Let’s start with why :)
Why is it very important to have a clear data migration strategy? Because even if we build a near-perfect technology platform, once bad data drives the platform, it can turn into a potential disaster and nightmare. This will result in difficult customer adoption and retention issues, revenue impact, and lastly incorrect and highly volatile behaviours that drive inaccurate analytics.
Now let's talk about the vision and purpose behind data migration for Medidata, especially around CTMS.
As part of the vision, there are 3 key goals we wanted to achieve:
How can we provide a seamless and unified data migration platform for our existing and new customers?
How do we provide an upgrade path for our existing customers from legacy systems to the new Medidata Clinical platform, to reduce cost and improve operational efficiency?
The last and final goal is providing a single, 360º view of migrated data to both our internal and external customers, through intelligent monitoring and reporting.
As part of the migration strategy, the key areas of risk and constraint that need to be considered are:
Potential data loss during migration, because of cleansing, transformation, and various mapping activities.
Various time-bound factors, such as the time and duration of the overall activity and the overlap and sync of old and migrated data, could create inconsistencies between source and destination, which may potentially lead to data integrity issues as well.
Lastly, the capability/functionality gap between source and destination systems: it's a fact that we can't do a like-for-like match of all functionality between source and destination, so it can end up in loss or consolidation of functionality and data in the destination.
Now, let me talk through the approach we took to achieve the overall strategy and execution.
As you see here, there are 3 key stages:
Analysis
Development
Go live and support
During the Analysis phase, we set objectives, scope, and success criteria, and identified and engaged with key stakeholders and partners, including customers. Then we performed a detailed data analysis, where we assessed data models between source and destination, along with data quality. This generated:
a data specification document,
an assessment of the quality of data in the source systems,
a mapping document capturing data relationships between source and destination, and
documentation of all customer journeys with their respective data flows.
The next key stage is Development, where we did Design, Build, and Test.
As part of the Design phase,
We came up with a high-level enterprise architecture.
After that we went through the buy/build process, where we selected technologies and tools based on scale, complexity, ease of adoption, business value, cost, and time to market.
Then we identified and documented all data mapping and translation rules.
During the Build phase, we developed capabilities to extract data from different sources in different formats, transforming it into a unified format which destinations can understand, while adhering to all mapping and business rules.
Along with transformation, we also built the capability to publish the data into destination systems in certain sequences, so that data integrity was maintained. A minimal sketch of this flow follows below.
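Here is a minimal Python sketch of that extract/transform/publish flow; the file formats, field mappings, entity names, and publish target are invented to illustrate the idea, not Medidata's actual pipeline.

```python
# Extract heterogeneous sources into one shape, map fields, publish in order.
import csv, json

def extract(path: str) -> list[dict]:
    """Read CSV or JSON sources into a common list-of-dicts format."""
    if path.endswith(".csv"):
        with open(path, newline="") as f:
            return list(csv.DictReader(f))
    if path.endswith(".json"):
        with open(path) as f:
            return json.load(f)
    raise ValueError(f"unsupported source format: {path}")

def transform(record: dict, mapping: dict[str, str]) -> dict:
    """Rename source fields to destination fields per the mapping document."""
    return {dest: record.get(src) for src, dest in mapping.items()}

# Parents before children, so foreign keys already exist when rows land.
PUBLISH_ORDER = ["studies", "sites", "subjects", "visits"]

def publish(batches: dict[str, list[dict]]) -> None:
    for entity in PUBLISH_ORDER:
        for row in batches.get(entity, []):
            print(f"POST /{entity}", row)   # stand-in for the real destination API
```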
The most important part of developing this solution is the right level of instrumentation, so that we can proactively monitor and measure system availability and performance.
The key challenge is not just building the solution, but having a proper test strategy covering both sanity tests with smaller data sets and volumetric data tests. This test strategy was executed through a combination of automated and manual spot tests.
Once we had developed the solution and established the test strategy, we ran several cycles of testing, issue identification, and fixes until we met our quality acceptance goal.
In addition to system design and testing, we also built a reporting capability where users can identify and trace the data, and follow and resolve issues in a timely manner. This in turn helps maintain data integrity between source and destination and a seamless customer experience.
The last and critical stage is go-live and support. This is the stage where we deployed the solution into production in an automated fashion, integrated with a centralised monitoring system.
Before actual customer migration starts, we defined a business process flow where every customer goes through a pilot run with real customer data, to check whether the E2E flow with all integrations is working as expected.
Once the pilot run completes successfully, we cleanse the customer data and perform the actual migration. This is not a one-day job; it is a cycle of activities that happens over time and needs continuous care and support until we achieve our migration completion goals for each customer.
A while ago, I found this image by accident, when we were going through a challenging phase of the project. Believe it or not, it is so true!
No matter how big the migration or how well we plan, there will always be some surprises and challenges. So I would like to share some of the challenges we went through and how we addressed them.
First one,
“Connect” with key stakeholders. No matter the size of the migration, validate your assumptions with key stakeholders and explain the impact on them before you get going on the task. If you don't, it will come back to bite you at some stage and disrupt your timelines.
Second key one
“Constant Communication” with the business. Once you’ve explained the project to the stakeholders, be sure to keep them informed of your progress. It’s best to provide a status report on a weekly basis, especially if things get off track.
Next being
Data governance. Be sure you're clear on who has the rights to create, approve, edit, or remove data from the source system, document that in writing as part of your project plan, and define a clear process.
If you ask me, this is the key to success:
Planning. Do not underestimate the analysis between source and target systems; understand at least 70% before planning. We went to development with a 40% understanding, which resulted in a good amount of change to destination systems and behaviour in a short time. It also generated rework during the development phase.
Another important one to consider
Having the right skills and expertise. Although this sounds like a straightforward task, there's a lot of complexity involved in moving data. Having an experienced professional with excellent knowledge of both source and destination helps the process go smoothly.
Next,
Don't wait for a perfect spec; make a start and go with 70% readiness. You have to iterate as you go along to reach the target state.
Last but one
Having a clear strategy and execution around testing. We made a choice to go with full automation, then realised it was too much upfront cost and slowed us down, so we moved to a hybrid approach of automation and manual testing.
Finally
Not spending a good amount of time on dry runs with “real” sample source data, especially data variations, during the development and release cycle.
Lastly, I would like to share a few best practices that we have been following. I think it is an important aspect to cover, giving people the opportunity to gain a little from our experience and understand what works.
Tip 1 – Understanding the data
Before starting the data migration, you have to prepare your data, carrying out an assessment of what is present in the source system and understanding clearly which data needs to be migrated.
We can divide the assessment of the source system into two macro categories:
Assessment of the data meaning
Assessment of the data quality
Data meaning: every piece of data that you move has to be validated, cleaned, and transformed. In data migration projects, migrating only relevant data ensures efficiency and cost control. Understanding how the source data will be used in the target system is necessary for defining what to migrate.
The second macro area is the assessment of the quality of the data.
It is very important to define a process to measure data quality early in the project. The quality analysis typically leads to a data cleaning activity, and cleaning the source data is a key element of reducing data migration effort. A minimal sketch of such a measurement follows below.
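As an illustration of measuring quality early, here is a minimal Python sketch with invented field names and validity rules; a real project would derive the rules from the data mapping specification document.

```python
# A per-field data quality scorecard: completeness and validity.
import re

RULES = {
    "subject_id": lambda v: bool(re.fullmatch(r"S\d{6}", str(v))),
    "enrollment_date": lambda v: bool(re.fullmatch(r"\d{4}-\d{2}-\d{2}", str(v))),
    "country": lambda v: str(v).strip() != "",
}

def quality_scorecard(records: list[dict]) -> dict[str, dict[str, float]]:
    """Score each field so cleaning effort can be targeted where it is needed."""
    scores = {}
    total = len(records) or 1
    for field_name, is_valid in RULES.items():
        present = [r[field_name] for r in records
                   if r.get(field_name) not in (None, "")]
        scores[field_name] = {
            "completeness": len(present) / total,
            "validity": sum(is_valid(v) for v in present) / (len(present) or 1),
        }
    return scores
```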
Tip 2 – Project Governance
The best way to approach a data migration project is to clearly define roles and responsibilities and avoid overlapping accountability. This can be done in several steps:
Define the owner of the data in the target system
Include the business users in decision-making. They understand the history, the structure and the meaning of the source data.
Based on our experience, what makes a difference is the presence of a business analyst. This is a person who acts as a bridge between the technical staff involved in the technical implementation of the migration and the businesspeople.
Tip 3 – Roll back & Dry Run
A rollback strategy has to be put in place in order to mitigate the risk of potential failures. Access to source data has to be read-only; this prevents any kind of data modification and ensures its integrity. A minimal sketch of both safeguards follows below.
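Here is a minimal sketch of the two safeguards, using SQLite purely for illustration; the table name, columns, and file paths are assumptions, and both database files are assumed to exist.

```python
# Read-only source access plus a pre-migration snapshot for rollback.
import shutil
import sqlite3

SOURCE_DB, DEST_DB = "source.db", "dest.db"

# Read-only connection: the mode=ro URI flag makes any write raise an error.
src = sqlite3.connect(f"file:{SOURCE_DB}?mode=ro", uri=True)

# Rollback strategy: snapshot the destination before loading, so a failed
# run can be undone completely by restoring the copy.
shutil.copyfile(DEST_DB, DEST_DB + ".pre_migration.bak")

try:
    dest = sqlite3.connect(DEST_DB)
    with dest:  # transaction: commits on success, rolls back on exception
        for row in src.execute("SELECT id, name FROM subjects"):
            dest.execute("INSERT INTO subjects (id, name) VALUES (?, ?)", row)
except Exception:
    shutil.copyfile(DEST_DB + ".pre_migration.bak", DEST_DB)  # restore snapshot
    raise
```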
Tip 4 – The Importance of the Data Mapping Specification Document
This document is the core of the data migration. It ensures complete field mapping and is used to collect all mapping rules and exceptions. This project phase is usually long and tiring, for a number of reasons:
Volume and amount of data details
Technical activity with technical documents
Little knowledge of dynamics of target database
Compromises that have to be made
Some tips I can share to help you do it in the most efficient way (a small sketch follows this list):
Clarify what has to be migrated and what shouldn’t be migrated
Clean source data – this will reduce the number of fields to migrate
Liaise with a business analyst who will translate technical requirements and help explain how the data will work in the target system
Rely on data migration experts who have already performed similar migrations in the past
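One way to keep the mapping specification actionable, sketched here in Python with invented fields and rules, is to make the spec machine-readable so it can drive the transformation code and flag unmapped source fields instead of letting them slip through silently.

```python
# A machine-readable slice of a data mapping specification document.
MAPPING_SPEC = [
    # (source_field,   dest_field,        rule)
    ("SUBJ_ID",        "subject_id",      "copy"),
    ("ENROLL_DT",      "enrollment_date", "date:%d-%b-%Y->%Y-%m-%d"),
    ("SITE_CODE",      "site_id",         "lookup:site_codes"),
    ("LEGACY_NOTES",   None,              "drop"),   # explicitly not migrated
]

def unmapped_fields(source_fields: list[str]) -> list[str]:
    """Any source field absent from the spec is a gap to resolve with the
    business analyst before migration, not a silent omission."""
    specified = {src for src, _, _ in MAPPING_SPEC}
    return [f for f in source_fields if f not in specified]

print(unmapped_fields(["SUBJ_ID", "ENROLL_DT", "SITE_CODE", "DOB"]))  # ['DOB']
```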
Lastly, Tip 5 – Perform comprehensive validation testing
To ensure that the goals of the data migration strategy are achieved, a company needs to develop a solid migration verification process. The data migration verification strategy needs to include ways to prove that the migration was completed successfully and that data integrity was maintained.
So, to quickly recap:
We covered what CTMS and clinical data are, and we talked about strategy, vision, and purpose,
the constraints and risks that come with these kinds of projects,
the approach we took,
and, in the end, the challenges, learnings, and best practices.
As Steve Jobs said, the team comes first. One pillar is strategy, but the most important factor for success is the team. I'm extremely happy to be part of such a great team at Medidata, which helped us achieve this goal. This is just the beginning of a journey.
I hope this session has been informative for you all!
Thank you.