I presented this one at the HINZ Conference, 7-9 Nov 2012, in Rotorua, New Zealand.
ABSTRACT:
As we move into new paradigms of care, sharing of health information becomes crucial. We need new systems and more interconnectivity to support this. The regional approach to eHealth solutions in New Zealand hinges on establishing trusted and interoperable systems. The Interoperability Reference Architecture is a first step towards providing overall principles and standards to reach this goal. A core group from the Sector Architects Group was formed and prepared the first draft of this document. After initial internal feedback it went through wider consultation, including with the public. Good feedback was received, including from international reviewers. It then went through formal HISO processes and was approved as a national interim standard. The Reference Architecture comprises three pillars which define: 1) XDS-based access to clinical data repositories, 2) a common content model, underpinned by CCR and openEHR Archetypes, to which all health information exchange should conform, and 3) use of CDA as the common currency for payload. A trial implementation is yet to be conducted; however, we used the Content Model to align the ePrescribing data model with the Australian model in order to validate the methodology. The Reference Architecture will provide an incremental, step-by-step implementation approach to interoperability and thus minimise risk.
CER HUB An Informatics Platform for Conducting Comparative Effectiveness with ...HMO Research Network
The document describes the CER Hub, an informatics platform for conducting comparative effectiveness research using electronic medical record data. The CER Hub allows researchers to develop standardized processors to generate research datasets from heterogeneous EMR systems. It facilitates collaborative projects to address questions like evaluating asthma control and smoking cessation treatments. Initial projects through the CER Hub involve developing measures of asthma control and comparing the effectiveness of treatment intensification options using EMR data from six health systems.
Development of the Gestational Diabetes Registry at CMDHB (New Zealand) using...Koray Atalag
This is the prezo I gave at the Australasian Long-Term Conditions Conference in Auckland on 30 Jul 2014. The focus was on prevention and management of long-term conditions, where the use of clinical registries has proven effective. This is a pilot project at a large healthcare provider organisation in Auckland (Counties Manukau District Health Board) where we used the full openEHR stack to build a web-based front end on the OceanEHR backend.
GP2GP In Action - Transferring Patient Records Around New Zealand, Electronic...Peter Jordan
The GP2GP system allows electronic transfer of patient records between general practices in New Zealand. It has led to improved patient satisfaction and care continuity. Uptake has been over 90% within 2.5 years. Over 85% of practices are on track to meet the annual target of 375,000 record transfers between practices. Barriers to adoption include some IT infrastructure limitations. Future work includes allowing unlimited attachment sizes and implementing electronic requests for record transfers.
I gave this prezo to the Auckland Regional Clinical IS Leadership Group on Feb 21, 2014. It shows, with a striking example (originally from Dr. Sam Heard), how difficult it can be to deal with certain kinds of health information when developing systems. Therefore we need rigorous and scientific methods to tackle this - in this case using openEHR's multi-level modelling approach to create a single content model from which all health information exchange payload definitions will be derived. New Zealand's Interoperability Reference Architecture (HISO 10040) is underpinned by openEHR Archetypes to create this content model. The bottom line of the prezo is that almost every national programme starts health information standardisation from the wrong place; most start from complex technical specifications, like CDA, which are almost impossible for clinicians to comprehend and give feedback on. The process is flawed! Instead it should start from easy-to-understand representations, such as simple diagrams, mindmaps, etc., and be handed over to techies once clinical validity and utility is agreed upon. That's the beauty of the Archetype approach - great tooling and the Clinical Knowledge Manager (CKM) enable clinicians and other domain experts to collaborate and develop clinical models easily.
What if we never agree on a common health information model?Koray Atalag
In this talk I will touch on some hard problems in health informatics around working with structured data and why we can’t link and reuse it with ease. The essence of the problem is that, while clinicians can perfectly understand each other, IT systems can’t. Traditional IT requires formally defined common terminology, metadata, data and process definitions. While medicine is mostly accepted as a positive science, the great variation in its body of knowledge and practice is often seen as ‘Art’. Ignoring this, IT people tend to develop all-inclusive common information models (almost always too complex to implement) and expect everybody to adhere to them. Clinicians love to do things a bit differently and of course don’t buy into that! Maybe they are right! Maybe we don’t have to agree on a uniform model at all. This is the basic assumption of the openEHR methodology, which I will describe by giving clinical examples. The main premise of this approach is to effectively separate the tasks of healthcare and technical professionals. Clinicians can easily define their information needs as they like using visual tools – producing Archetypes, which are essentially maximal data sets. These computable artefacts, built from a well-defined set of technical building blocks, are then fed into the technical environment to integrate data or develop software. Lastly, the free web-based openEHR Clinical Knowledge Manager portal provides collaborative Archetype development and ensures semantic consistency among different models.
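To make the "maximal data set" idea concrete, here is a toy sketch of what an Archetype expresses: a superset of fields with constraints, against which any data instance can be validated. Real archetypes are written in ADL and constrain the openEHR reference model; the field names and ranges below are hypothetical, for illustration only.

```python
# Illustrative stand-in for an openEHR Archetype: a maximal set of fields with
# constraints. Real archetypes use ADL; these names/ranges are made up.
BLOOD_PRESSURE_ARCHETYPE = {
    "systolic":  {"type": float, "range": (0, 1000), "required": True},
    "diastolic": {"type": float, "range": (0, 1000), "required": True},
    "cuff_size": {"type": str, "required": False},   # part of the maximal set,
    "position":  {"type": str, "required": False},   # rarely all recorded
}

def validate(instance: dict, archetype: dict) -> list[str]:
    """Return a list of constraint violations (empty means valid)."""
    errors = []
    for name, rule in archetype.items():
        if name not in instance:
            if rule["required"]:
                errors.append(f"missing required field: {name}")
            continue
        value = instance[name]
        if not isinstance(value, rule["type"]):
            errors.append(f"{name}: expected {rule['type'].__name__}")
        elif "range" in rule:
            lo, hi = rule["range"]
            if not (lo <= value <= hi):
                errors.append(f"{name}: {value} outside {lo}..{hi}")
    for name in instance:
        if name not in archetype:
            errors.append(f"unknown field: {name}")  # data must stay inside the model
    return errors

print(validate({"systolic": 120.0, "diastolic": 80.0}, BLOOD_PRESSURE_ARCHETYPE))
```

The point of the sketch: clinicians agree on the maximal model once, while any given record may fill only a subset of it, and software can still check conformance mechanically.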
Beating Bugs with Big Data: Harnessing HPC to Realize the Potential of Genomi...Tom Connor
Introducing the HPC challenges associated with developing a set of clinical microbial genomics services in the NHS in Wales. Demonstrating the potential of these technologies, and the impact it is already having for the patients of the Welsh NHS.
Content Modelling for VIEW Datasets Using ArchetypesKoray Atalag
I also presented this one at the HINZ conference.
ABSTRACT:
Use of health information for multiple purposes maximises its value. A good example is PREDICT, a clinical decision support system which has been used in New Zealand for a decade. Collected data are linked and enriched with a number of databases, including national collections, laboratory tests and pharmacy dispensing. We are proposing a new model-driven approach to data management based on openEHR Archetypes, for the purpose of improving data linkage and future-proofing of data. The study looks at the feasibility of building a content model for PREDICT - the methodology underpinning the Interoperability Reference Architecture. The main premise of the content model is to provide a canonical model of health information which will be used to align incoming data from other data sources. With this approach it is possible to extend datasets without breaking semantics over long periods of time – a valuable capability for research. The content model was developed using existing archetypes from the openEHR and NEHTA repositories. Except for two checklist-type items, reused archetypes can faithfully represent the whole PREDICT dataset. The study also revealed we will need New Zealand-specific extensions for demographic data. Use of archetype-based content modelling can improve secondary use of clinical data.
The document discusses a project to analyze and predict sepsis early using clinical data. It aims to predict sepsis 6 hours before clinical diagnosis to allow for earlier treatment. The author handles missing data and class imbalance in a large dataset. Features are engineered and selected. Decision tree and XGBoost models are used for prediction, achieving partial success. Further research is needed on time-series modeling, feature importance, and reviewing model performance with a domain expert.
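The summary mentions handling class imbalance but not how. One common approach is random oversampling of the minority (septic) class; the sketch below is illustrative only and is not taken from the project.

```python
# A minimal sketch of one common way to handle class imbalance: random
# oversampling of the minority class until both classes are the same size.
# The summarized project does not specify its technique; this is illustrative.
import random

def oversample_minority(rows, labels, seed=0):
    """Duplicate minority-class rows until both classes are equally sized."""
    rng = random.Random(seed)
    pos = [i for i, y in enumerate(labels) if y == 1]
    neg = [i for i, y in enumerate(labels) if y == 0]
    minority, majority = (pos, neg) if len(pos) < len(neg) else (neg, pos)
    extra = [rng.choice(minority) for _ in range(len(majority) - len(minority))]
    order = list(range(len(rows))) + extra
    return [rows[i] for i in order], [labels[i] for i in order]

rows = [[0.1], [0.2], [0.3], [0.4], [0.9]]
labels = [0, 0, 0, 0, 1]           # 4:1 imbalance, common in sepsis datasets
X, y = oversample_minority(rows, labels)
print(sum(y), len(y) - sum(y))     # now balanced
```

Oversampling must be applied after the train/test split, otherwise duplicated minority rows leak into the evaluation set.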
Implementation and Use of ISO EN 13606 and openEHRKoray Atalag
This was the prezo for the EMBC 2013 tutorial in Osaka, Japan, intended as an introduction to the standards, technicalities, and implementation of openEHR - the original formalism.
Big Data Analytics for Treatment Pathways John CaiJohn Cai
This document discusses using real-world big data analytics to understand treatment pathways. It begins by explaining the need for real-world evidence from real-world data to assess effectiveness and outcomes beyond randomized clinical trials. It then describes the volume, variety, and velocity characteristics of real-world big data from sources like claims, EMRs, surveys, and devices. Technical challenges of reconstructing complex patient journeys are discussed. Hadoop and MapReduce are presented as a potential solution by breaking the work into mappers that extract patient data and reducers that organize it into timelines. Examples are given of how this could enable cost, pathway, and outcomes analyses to better inform decision making.
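The mapper/reducer pattern described above can be sketched in a few lines: mappers emit (patient_id, event) pairs from raw records, a shuffle groups pairs by patient, and reducers sort each patient's events into a timeline. The record fields below are hypothetical; real pipelines would run this on Hadoop rather than in-process.

```python
# Stdlib sketch of the map/shuffle/reduce flow for reconstructing patient
# journeys. Field names and events are made up for illustration.
from collections import defaultdict

raw_records = [
    {"patient": "p1", "date": "2014-03-02", "event": "ER visit"},
    {"patient": "p2", "date": "2014-01-15", "event": "Rx: metformin"},
    {"patient": "p1", "date": "2014-01-09", "event": "Dx: asthma"},
]

def mapper(record):
    """Emit the key/value pair for shuffling: (patient, (date, event))."""
    yield record["patient"], (record["date"], record["event"])

def reducer(patient, events):
    """Order one patient's events into a timeline."""
    return patient, sorted(events)

# Simulate the shuffle phase that Hadoop performs between map and reduce.
shuffled = defaultdict(list)
for record in raw_records:
    for key, value in mapper(record):
        shuffled[key].append(value)

timelines = dict(reducer(p, ev) for p, ev in shuffled.items())
print(timelines["p1"][0])   # earliest event first
```

Because ISO-8601 dates sort lexicographically, `sorted` on (date, event) tuples is enough to order each journey.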
TransCelerate is a nonprofit organization that aims to accelerate medical research by improving collaboration across the pharmaceutical industry. It has developed a Common Protocol Template (CPT) to standardize clinical trial protocols. The CPT provides a streamlined template for protocol content and format to make protocols easier to interpret, reduce complexity and costs, and enable automation. The CPT benefits various stakeholders by improving efficiency and quality. Its adoption by sponsors is valuable as it leverages industry expertise, supports compliance, and balances quality improvements with efficiency gains over time.
Impact Of a Clinical Decision Support Tool on Asthma Patients with Current As...Yiscah Bracha
The document summarizes research on the effect of computerized decision support (CDS) on the percentage of asthma patients with asthma action plans. The research found:
1) Implementation of a CDS tool at clinics led to increases in the percentage of pediatric patients with current asthma action plans, especially at clinics that previously lacked paper templates.
2) For adults, clinics that emphasized asthma action plans and had physicians start using the CDS tool saw increases, while clinics without paper templates saw physicians begin using the tool.
3) Statistical analysis showed the CDS tool had an initial positive effect at one pediatric clinic that oscillated over time, while having no significant effect at other clinics, possibly due to pre-existing tendencies of physicians to
Enabling Clinical Data Reuse with openEHR Data Warehouse EnvironmentsLuis Marco Ruiz
Databases for clinical information systems are difficult to design and implement, especially when the design must comply with a formal specification or standard. The openEHR specifications offer a very expressive and generic model for clinical data structures, allowing semantic interoperability and compatibility with other standards like HL7 CDA, FHIR, and ASTM CCR. But openEHR is not only for data modeling: it specifies an EHR computational platform designed to create highly modifiable, future-proof EHR systems and to support long-term, economically viable projects, with a knowledge-oriented approach that is independent of specific technologies. Software developers face great complexity in designing openEHR-compliant databases, since the specifications do not include any guidelines in that area. The authors of this tutorial are developers who had to overcome these challenges. The tutorial will cover the requirements, design principles, technologies, techniques and main challenges of implementing an openEHR-based clinical database, with examples and lessons learned to help designers and developers overcome the challenges more easily.
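One persistence design often discussed for openEHR data (not necessarily the one this tutorial teaches) is "node + path" storage: each data point is stored against its archetype path in a generic table, rather than in a bespoke schema per clinical form. A minimal sketch with SQLite, using made-up paths and values:

```python
# Sketch of node+path persistence: a single generic table keyed by archetype
# path instead of one schema per form. Paths and values are illustrative.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE datapoint (
        composition_id TEXT,
        archetype_path TEXT,   -- path of the node within the archetype
        value          TEXT
    )
""")

composition = {
    "openEHR-EHR-OBSERVATION.blood_pressure.v1/systolic": "142",
    "openEHR-EHR-OBSERVATION.blood_pressure.v1/diastolic": "91",
}
db.executemany(
    "INSERT INTO datapoint VALUES (?, ?, ?)",
    [("comp-001", path, value) for path, value in composition.items()],
)

# Querying by path works across every composition, whatever form produced it.
rows = db.execute(
    "SELECT value FROM datapoint WHERE archetype_path LIKE ?",
    ("%blood_pressure%systolic",),
).fetchall()
print(rows)
```

The trade-off is classic EAV: the schema never changes when new archetypes appear, but queries join and filter on paths rather than columns, which shifts the complexity into indexing and query design.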
Enabling Clinical Data Reuse with openEHR Data Warehouse EnvironmentsLuis Marco Ruiz
Modern medicine needs methods to enable access to data captured during health care for research, surveillance, decision support and other reuse purposes. Initiatives like the National Patient-Centered Clinical Research Network in the US and Electronic Health Records for Clinical Research in the EU are facilitating the reuse of Electronic Health Record (EHR) data for clinical research. One of the barriers to data reuse is the integration and interoperability of different healthcare information systems (HIS), owing to the differences among their information and terminology models. The use of EHR standards like openEHR can alleviate these barriers by providing a standard, unambiguous, semantically enriched representation of clinical data to enable semantic interoperability and data integration. Few works have been published describing how to move proprietary data stored in EHRs into standard openEHR repositories. This tutorial provides an overview of the key concepts, tools and techniques necessary to implement an openEHR-based Data Warehouse (DW) environment for reusing clinical data. We aim to provide insights into extracting data from proprietary sources, transforming it into openEHR-compliant instances to populate a standard repository, and enabling access to it using standard query languages and services.
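The extract-transform step can be sketched as re-keying proprietary columns onto archetype paths. The source column names, target paths, and the AQL query text below are assumptions for illustration, not content from the tutorial.

```python
# Minimal extract-transform sketch: map a row from a hypothetical proprietary
# lab table onto openEHR-style archetype paths. All names here are made up.
PROPRIETARY_TO_OPENEHR = {
    # proprietary column -> archetype path in the target repository
    "glucose_mmol_l": "openEHR-EHR-OBSERVATION.lab_test.v1/result/glucose",
    "sample_time":    "openEHR-EHR-OBSERVATION.lab_test.v1/time",
}

def transform(row: dict) -> dict:
    """Re-key a proprietary row onto standard paths, dropping unmapped columns."""
    return {
        PROPRIETARY_TO_OPENEHR[col]: value
        for col, value in row.items()
        if col in PROPRIETARY_TO_OPENEHR
    }

instance = transform(
    {"glucose_mmol_l": 5.4, "sample_time": "2015-06-01T08:00", "row_id": 7}
)
print(sorted(instance))

# Once populated, such a repository is typically queried with AQL, openEHR's
# archetype-aware query language. Illustrative query text only:
AQL = """
SELECT o/data/events/data/items[at0001]/value
FROM EHR e CONTAINS OBSERVATION o[openEHR-EHR-OBSERVATION.lab_test.v1]
"""
```

Because queries address archetype paths rather than source schemas, the same AQL works regardless of which proprietary system the data originally came from; that is the interoperability payoff the abstract describes.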
openEHR Approach to Detailed Clinical Models (DCM) Development - Lessons Lear...Koray Atalag
Presented at Health Informatics New Zealand (HINZ 2017) Conference, 1-3 Nov 2017, Rotorua, New Zealand. Based on my Masters student Peter Wei's research. Authorship: Ping-Cheng Wei, Koray Atalag and Karen Day from the University of Auckland.
Health research, clinical registries, electronic health records – how do they...Koray Atalag
This is a talk I gave at my own organisation - National Institute for Health Innovation (NIHI) of the University of Auckland on 6 Aug 2014. Abstract as follows:
In this talk I’ll first cover the topic of clinical registries – an invaluable tool for supporting clinical practice that is also gaining momentum in research and quality improvement. NIHI has been very active in this space: we have delivered the prestigious and highly successful National Cardiac Registry (ANZACS-QI) together with the VIEW research team and also very recently launched the Gestational Diabetes Registry with Counties Manukau DHB & Diabetes Projects Trust. A few others are likely to come down the line. This is a huge opportunity for health-data-driven research and for NIHI to position itself as ‘the health data steward’ in the country, given our independent status, existing IT infrastructure and “good culture” of working with health data. NIHI’s ‘health informatics’ twist in delivering these projects is how we go about defining ‘information’ – using a scientifically credible and robust methodology: openEHR. This is an international (and now national too) standard for defining health information unambiguously so that it is easy to understand and also computable. We build software (even automatically in some cases!) using models created by this formalism. I’ll give the basics of the openEHR approach and then walk you through how to make sense of all this. Hopefully you may get an idea about its ‘value proposition’ (as business people call it) or Science merit as I like to call it ;)
Michele Tonutti - Scaling is caring - Codemotion Amsterdam 2019Codemotion
A key challenge we face at Pacmed is quickly calibrating and deploying our tools for clinical decision support in different hospitals, where data formats may vary greatly. Using Intensive Care Units as a case study, I’ll delve into our scalable Python pipeline, which leverages Pandas’ split-apply-combine approach to perform complex feature engineering and automatic quality checks on large time-varying data, e.g. vital signs. I’ll show how we use the resulting flexible and interpretable dataframes to quickly (re)train our models to predict mortality, discharge, and medical complications.
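The split-apply-combine idea behind that pipeline can be shown in miniature: split measurements by patient, apply feature engineering and a quality check per group, combine into one feature table. The talk's pipeline uses pandas on real ICU data; the sketch below uses only the stdlib, and the vital-sign values are made up.

```python
# Split-apply-combine on time-varying vitals, sketched with the stdlib.
# (The actual Pacmed pipeline uses pandas; these numbers are illustrative.)
from itertools import groupby
from statistics import mean

measurements = [                    # (patient, hour, heart_rate), sorted by patient
    ("icu-1", 0, 88), ("icu-1", 1, 95), ("icu-1", 2, 121),
    ("icu-2", 0, 60), ("icu-2", 1, 62),
]

features = {}
for patient, group in groupby(measurements, key=lambda m: m[0]):   # split
    rates = [hr for _, _, hr in group]
    features[patient] = {                                          # apply
        "hr_mean": round(mean(rates), 1),
        "hr_max": max(rates),
        "hr_abnormal": any(hr > 100 for hr in rates),  # toy quality/alert check
    }

print(features["icu-1"])                                           # combine
```

In pandas the same shape is one `groupby(...).agg(...)` call; the stdlib version makes the split/apply/combine phases visible. Note that `itertools.groupby` requires the input already sorted by the grouping key.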
This document summarizes a presentation about identifying deficiencies in long-term condition management using electronic medical records. It discusses using data mining of electronic medical records to analyze hypertension management and electronic referrals. Case studies show opportunities for improved monitoring and treatment of long-term conditions were identified. The presentation encourages using available electronic health record data to help improve healthcare processes and outcomes.
Ms. Drury outlines the EHR world for these Davies Winners before ARRA and the EHR Incentive Program existed, sharing the environment and the motivation for these privately owned physician practices that have been recognized by HIMSS as Davies Ambulatory Award Winners. The HIMSS Nicholas E. Davies Award of Excellence recognizes excellence in the implementation and use of health information technology, specifically electronic health records (EHRs), for healthcare organizations, independent physician practices and public health systems. The HIMSS process of evaluating applications from these practices and validating the use and value of HIT is rigorous, both for the applicants and for the HIMSS Ambulatory Award Committee.
The document discusses an open-source electronic health record (EHR) system called Oscar and describes its architecture and features. It provides examples of how Oscar has been used in radiotherapy settings and primary care clinics. The document also discusses a personal health record (PHR) module called MyOSCAR that is integrated with Oscar. MyOSCAR allows patients to access and share their health records. Two pilot studies are summarized that examine the use of MyOSCAR for blood pressure management and collecting drug safety data from patients. The studies found high completion rates of tasks in MyOSCAR and positive feedback from patients wishing to continue using the application.
"How do Professional Record Standards Support Timely Communication & Information Flows for all Participants in Health & Social Care"? Gurminder khamba (Clinical Lead for Secondary Care, HSCIC) discusses this question at the Healthcare Efficiency Through Technology Expo 2013.
Best Practices in Testing Biometric WearablesValencell, Inc
Wearables and hearables that measure biometric signals like heart rate are different from other devices, because they have to interact with the human body and every human body is different. This makes testing and validation of the devices an important part of the product development process.
Valencell operates one of the most experienced testing labs for biometric wearables and hearables, testing hundreds of devices over thousands of hours of testing every year.
In this presentation for the Digital Health Institute Summit 2020 I will explain how we overcame barriers to patient engagement and achieved very high response rates using our ePRO ZEDOC Platform. I'll give real-world insights from a project we ran at the Rheumatology service at NUH in Singapore.
I wear two hats - this talk is with the first one!
Computational Model Discovery for Building Clinical Applications: an Example ...Koray Atalag
This document discusses using computational models and semantic web technologies to enable discovery and reuse of renal transport models. It presents seven exemplar renal transport models at different spatial scales that have been curated and annotated. These models simulate processes like sodium-hydrogen exchange, sodium-glucose cotransport, and thiazide-sensitive sodium-chloride transport. Tools like CellML, SemGen, and Physiome Model Repository are used to encode, annotate, and store the models. The goal is to leverage standards and ontologies to facilitate discovery and integration of these computational models for building clinical applications.
More Related Content
Similar to Underpinnings of the New Zealand Interoperability Reference Architecture
Content Modelling for VIEW Datasets Using ArchetypesKoray Atalag
This one also I presented at the HINZ conference.
ABSTRACT:
Use of health information for multiple purposes maximises its value. A good example is PREDICT, a clinical decision support system which has been used in New Zealand for a decade. Collected data are linked and enriched with a number of databases, including national collections, laboratory tests and pharmacy dispensing. We are proposing a new model-driven approach for data management based on openEHR Archetypes for the purpose of improving data linkage and future-proofing of data. The study looks at feasibility of building a content model for PREDICT - a methodology underpinning the Interoperability Reference Architecture. The main premise of the content model will be to provide a canonical model of health information which will be used to align incoming data from other data sources. With this approach it is possible to extend datasets without breaking semantics over long periods of time – a valuable capability for research. The content model was developed using existing archetypes from openEHR and NEHTA repositories. Except for two checklist type items, reused archetypes can faithfully represent the whole PREDICT dataset. The study also revealed we will need New Zealand specific extensions for demographic data. Use of archetype based content modelling can improve secondary use of clinical data.
The document discusses a project to analyze and predict sepsis early using clinical data. It aims to predict sepsis 6 hours before clinical diagnosis to allow for earlier treatment. The author handles missing data and class imbalance in a large dataset. Features are engineered and selected. Decision trees and XGBoost models are used for prediction, achieving partial success. Further research is needed on time-series modeling, feature importance, and model performance with a domain expert.
Implementation and Use of ISO EN 13606 and openEHRKoray Atalag
This was the prezo for the EMBC 2013 tutorial in Osaka, Japan. Intended for an introduction to the standards and technicalities and implementation of openEHR - which is the original formalism.
Big Data Analytics for Treatment Pathways John CaiJohn Cai
This document discusses using real-world big data analytics to understand treatment pathways. It begins by explaining the need for real-world evidence from real-world data to assess effectiveness and outcomes beyond randomized clinical trials. It then describes the volume, variety, and velocity characteristics of real-world big data from sources like claims, EMRs, surveys, and devices. Technical challenges of reconstructing complex patient journeys are discussed. Hadoop and MapReduce are presented as a potential solution by breaking the work into mappers that extract patient data and reducers that organize it into timelines. Examples are given of how this could enable cost, pathway, and outcomes analyses to better inform decision making.
TransCelerate is a nonprofit organization that aims to accelerate medical research by improving collaboration across the pharmaceutical industry. It has developed a Common Protocol Template (CPT) to standardize clinical trial protocols. The CPT provides a streamlined template for protocol content and format to make protocols easier to interpret, reduce complexity and costs, and enable automation. The CPT benefits various stakeholders by improving efficiency and quality. Its adoption by sponsors is valuable as it leverages industry expertise, supports compliance, and balances quality improvements with efficiency gains over time.
Impact Of a Clinical Decision Support Tool on Asthma Patients with Current As...Yiscah Bracha
The document summarizes research on the effect of computerized decision support (CDS) on the percentage of asthma patients with asthma action plans. The research found:
1) Implementation of a CDS tool at clinics led to increases in the percentage of pediatric patients with current asthma action plans, especially at clinics that previously lacked paper templates.
2) For adults, clinics that emphasized asthma action plans and had physicians start using the CDS tool saw increases, while clinics without paper templates saw physicians begin using the tool.
3) Statistical analysis showed the CDS tool had an initial positive effect at one pediatric clinic that oscillated over time, while having no significant effect at other clinics, possibly due to pre-existing tendencies of physicians to
Enabling Clinical Data Reuse with openEHR Data Warehouse EnvironmentsLuis Marco Ruiz
Databases for Clinical Information Systems are difficult to
design and implement, especially when the design should be
compliant with a formal specification or standard. The
openEHR specifications offer a very expressive and generic
model for clinical data structures, allowing semantic
interoperability and compatibility with other standards like
HL7 CDA, FHIR, and ASTM CCR. But openEHR is not only
for data modeling, it specifies an EHR Computational
Platform designed to create highly modifiable future-proof
EHR systems, and to support long term economically viable
projects, with a knowledge-oriented approach that is
independent from specific technologies. Software Developers
find a great complexity in designing openEHR compliant
databases since the specifications do not include any
guidelines in that area. The authors of this tutorial are
developers that had to overcome these challenges. This
tutorial will expose different requirements, design principles,
technologies, techniques and main challenges of implementing
an openEHR-based Clinical Database, with examples and
lessons learned to help designers and developers to overcome the challenges more easily
Enabling Clinical Data Reuse with openEHR Data Warehouse EnvironmentsLuis Marco Ruiz
Modern medicine needs methods to enable access to data,
captured during health care, for research, surveillance,
decision support and other reuse purposes. Initiatives like the
National Patient Centered Clinical Research Network in the
US and the Electronic Health Records for Clinical Research
in the EU are facilitating the reuse of Electronic Health
Record (EHR) data for clinical research. One of the barriers
for data reuse is the integration and interoperability of
different Healthcare Information Systems (HIS). The reason is
the differences among the HIS information and terminology
models. The use of EHR standards like openEHR can alleviate
these barriers providing a standard, unambiguous,
semantically enriched representation of clinical data to
enable semantic interoperability and data integration. Few
works have been published describing how to drive
proprietary data stored in EHRs into standard openEHR
repositories. This tutorial provides an overview of the key
concepts, tools and techniques necessary to implement an
openEHR-based Data Warehouse (DW) environment to reuse
clinical data. We aim to provide insights into data extraction
from proprietary sources, transformation into openEHR
compliant instances to populate a standard repository and
enable access to it using standard query languages and
services.
openEHR Approach to Detailed Clinical Models (DCM) Development - Lessons Lear... - Koray Atalag
Presented at Health Informatics New Zealand (HINZ 2017) Conference, 1-3 Nov 2017, Rotorua, New Zealand. Based on my Masters student Peter Wei's research. Authorship: Ping-Cheng Wei, Koray Atalag and Karen Day from the University of Auckland.
Health research, clinical registries, electronic health records – how do they...Koray Atalag
This is a talk I gave at my own organisation - National Institute for Health Innovation (NIHI) of the University of Auckland on 6 Aug 2014. Abstract as follows:
In this talk I’ll first cover the topic of clinical registries – an invaluable tool for supporting clinical practice that is also gaining momentum in research and quality improvement. NIHI has been very active in this space: we have delivered the prestigious and highly successful National Cardiac Registry (ANZACS-QI) together with the VIEW research team, and very recently launched the Gestational Diabetes Registry with Counties Manukau DHB & Diabetes Projects Trust. A few others are likely to come down the line. This is a huge opportunity for health-data-driven research and for NIHI to position itself as ‘the health data steward’ in the country, given our independent status, existing IT infrastructure and “good culture” of working with health data. NIHI’s ‘health informatics’ twist in delivering these projects is how we go about defining ‘information’ – using a scientifically credible and robust methodology: openEHR. This is an international (and now national too) standard for defining health information unambiguously, so that it is easy to understand and also computable. We build software (even automatically in some cases!) using models created by this formalism. I’ll give the basics of the openEHR approach and then walk you through how to make sense of all these. Hopefully you may get an idea of its ‘value proposition’ (as business people call it) or Science merit, as I like to call it ;)
Michele Tonutti - Scaling is caring - Codemotion Amsterdam 2019 - Codemotion
A key challenge we face at Pacmed is quickly calibrating and deploying our tools for clinical decision support in different hospitals, where data formats may vary greatly. Using Intensive Care Units as a case study, I’ll delve into our scalable Python pipeline, which leverages Pandas’ split-apply-combine approach to perform complex feature engineering and automatic quality checks on large time-varying data, e.g. vital signs. I’ll show how we use the resulting flexible and interpretable dataframes to quickly (re)train our models to predict mortality, discharge, and medical complications.
This document summarizes a presentation about identifying deficiencies in long-term condition management using electronic medical records. It discusses using data mining of electronic medical records to analyze hypertension management and electronic referrals. Case studies show opportunities for improved monitoring and treatment of long-term conditions were identified. The presentation encourages using available electronic health record data to help improve healthcare processes and outcomes.
Ms. Drury outlines the EHR world for these Davies Winners before ARRA and the EHR Incentive Program existed, sharing the environment and the motivation for these privately owned physician practices who have been recognized by HIMSS as Davies Ambulatory Award Winners. The HIMSS Nicholas E. Davies Award of Excellence recognizes excellence in the implementation and use of health information technology, specifically electronic health records (EHRs), for healthcare organizations, independent physician practices and public health systems. The HIMSS process of evaluating applications from these practices and validating the use and value of HIT is rigorous for the applicants and for the HIMSS Ambulatory Award Committee.
The document discusses an open-source electronic health record (EHR) system called Oscar and describes its architecture and features. It provides examples of how Oscar has been used in radiotherapy settings and primary care clinics. The document also discusses a personal health record (PHR) module called MyOSCAR that is integrated with Oscar. MyOSCAR allows patients to access and share their health records. Two pilot studies are summarized that examine the use of MyOSCAR for blood pressure management and collecting drug safety data from patients. The studies found high completion rates of tasks in MyOSCAR and positive feedback from patients wishing to continue using the application.
"How do Professional Record Standards Support Timely Communication & Information Flows for all Participants in Health & Social Care"? Gurminder khamba (Clinical Lead for Secondary Care, HSCIC) discusses this question at the Healthcare Efficiency Through Technology Expo 2013.
Best Practices in Testing Biometric Wearables - Valencell, Inc
Wearables and hearables that measure biometric signals like heart rate are different from other devices, because they have to interact with the human body and every human body is different. This makes testing and validation of the devices an important part of the product development process.
Valencell operates one of the most experienced testing labs for biometric wearables and hearables, testing hundreds of devices over thousands of hours of testing every year.
In this presentation for Digital Health Institute Summit 2020 I will explain how we overcame barriers for patient engagement and achieved very high response rates using our ePRO ZEDOC Platform. I'll give real-world insights from a project we ran at the Rheumatology service at NUH in Singapore.
I wear two hats - this talk is with the first one!
Computational Model Discovery for Building Clinical Applications: an Example ... - Koray Atalag
This document discusses using computational models and semantic web technologies to enable discovery and reuse of renal transport models. It presents seven exemplar renal transport models at different spatial scales that have been curated and annotated. These models simulate processes like sodium-hydrogen exchange, sodium-glucose cotransport, and thiazide-sensitive sodium-chloride transport. Tools like CellML, SemGen, and Physiome Model Repository are used to encode, annotate, and store the models. The goal is to leverage standards and ontologies to facilitate discovery and integration of these computational models for building clinical applications.
A Semantic Web based Framework for Linking Healthcare Information with Comput... - Koray Atalag
Presented at Health Informatics New Zealand (HINZ 2017) Conference, 1-3 Nov 2017, Rotorua, New Zealand. Authorship: Koray Atalag, Reza Kalbasi, David Nickerson
The University of Auckland
openEHR in Research: Linking Health Data with Computational Models - Koray Atalag
My prezo at Medinfo 2017 openEHR Developers Workshop.
The aim was to demonstrate how openEHR supports very advanced research and analytics with examples from computational physiology and biosimulation to create patient-specific decision support.
Bringing Things Together and Linking to Health Information using openEHR - Koray Atalag
My prezo at Medinfo 2015 Conference in the workshop:
Digital Patient Modeling and Clinical Decision Support by Kerstin Denecke, Stefan Kropf, Claire Chalopin, Mario A. Cypko, Yihan Deng, Jan Gaebel, Koray Atalag
SNOMED Bound to (Information) Model | Putting terminology to work - Koray Atalag
Prezo I gave at the HL7 New Zealand FHIR and Ice Seminar (latter referring to SNOMED!). I was asked to talk briefly about how information models relate to terminology and also highlight some other information modelling formalisms and initiatives (e.g. openEHR, ISO/CEN 13606, CIMI and DICOM SR).
Clinical modelling with openEHR Archetypes - Koray Atalag
This is the prezo I used in CellML workshop in Waiheke Island, Auckland, New Zealand on 14 April 2015. The aim was to introduce information modelling with openEHR and how to achieve semantic interoperability by using shared ontologies and clinical terminology.
Linkages to EHRs and Related Standards. What can we learn from the Parallel U... - Koray Atalag
This is the prezo I used during the CellML workshop in Waiheke Island, Auckland, New Zealand on 13 April 2015. The aim was to introduce information modelling methods and tools for the purpose of inspiring computational modelling work in the area of semantics and interoperability.
A Standards-based Approach to Development of Clinical Registries - Initial Le... - Koray Atalag
This is the prezo I presented at HINZ 2014 conference.
Gestational diabetes has implications for both mother and child with risk of complications during pregnancy, and type 2 diabetes later in life. This paper presents the initial lessons learned from the development of a clinical registry. The aims of the Registry are: 1) 100% successful diabetes screening within 3 months of delivery; 2) Annual type 2 diabetes screening; 3) Early warning in subsequent pregnancies.
We have employed the openEHR standard which underpins our national interoperability reference architecture to represent the dataset and also to build the web-based registry system. Use of this rigorous methodology to tackle health information is expected to ensure semantic consistency of Registry data and maximise interoperability with other Sector projects. The development work has been facilitated by the ability to transform the dataset automatically into software code – ensuring clinical requirements accurately translated into technical terms.
Dataset has been finalised, registry system has been developed and deployed for pilot implementation. Data entry is underway for participants after consenting.
This registry is expected to increase the screening of women leading to earlier detection of diabetes. It should provide a valuable picture of the condition and is intended for extension and wider roll-out after evaluation.
Information Models & FHIR --- It’s all about content! - Koray Atalag
In this prezo I have touched upon what an information model is and what it is not, especially in relation to terminology. The highlight is to demonstrate the similarities (and differences) between clinical models of openEHR (archetypes & templates) and FHIR. It is obvious that the world doesn't need more standards, and a collaborative approach to content development is a necessity. Lastly I make a connection with New Zealand's content model approach.
Better Information, Better Care -- Directions for Health IT in New Zealand - Koray Atalag
- New Zealand has a national eHealth strategy and plan to achieve high quality healthcare through electronic health information exchange by 2014.
- Key aspects of the plan include implementing electronic medical records in primary care practices and hospitals, developing national clinical registries and systems for medications, oncology and cardiology, and regional health information platforms to enable information sharing across providers.
- New Zealand has strong foundations to support eHealth including a national health identifier used for over 20 years, standardized clinical terminology, and policies promoting privacy and security of personal health information.
New Zealand has a publicly funded healthcare system with universal coverage. It has a national electronic health record system including a unique patient identifier used for over 20 years. Most primary care practices use comprehensive electronic medical record systems integrated with labs and prescribing. Hospitals use integrated clinical workstations and patient administration systems. The national health IT plan aims to achieve high quality integrated care through shared care programs and national clinical systems like ePrescribing. Standards are developed through HISO and openEHR is used to define content and enable data sharing and secondary use through a shared health information platform.
This document discusses using formal modeling techniques like openEHR to improve the maintainability of clinical software. It summarizes research modeling the Minimal Standard Terminology for Digestive Endoscopy (MST) using openEHR archetypes. Implementing change requests from a previous endoscopy application in both the original application and a new one based on openEHR models found the openEHR-based application was significantly easier to maintain. Formal modeling addresses issues with non-standard clinical language and supports semantic interoperability and multilingual requirements.
Why ICT Fails in Healthcare: Software Maintenance and Maintainability - Koray Atalag
This presentation was for a SERG seminar at the University of Auckland Department of Computer Science. I present why software maintenance is a barrier for adoption of IT in healthcare and the maintainability aspects based on ISO/IEC 9126 software quality standard quality model. I then present the preliminary results of my research here.
Underpinnings of the New Zealand Interoperability Reference Architecture
1. Underpinnings of the
Interoperability Reference
Architecture
(HISO 10040)
Koray Atalag1, Alastair Kenworthy2, David Hay3
1.NIHI – University of Auckland
2.Ministry of Health
3.Orion Health
3. The Problem
• Patient centred integrated/shared care paradigms
hinge on more interconnectivity
• We all know about silos: 1+1 > 2 when shared
• It’s all about People, processes and technology
• Standards crucial – but need an overarching framework
– No one size fits all: depends on needs and resources
– Myriad of standards, methods etc.
– Not so much success so far worldwide
• Narrow opportunity window in NZ to enable sector-
wide consistency & interoperability
(too many projects in early flight or kicking off)
4. State of the world
• US: advanced provider-centric systems but little inter-
connectivity (HL7 v2/CDA)
• Canada: CHI providing leadership & standards
(v2/v3/CDA)
• UK: bootstrapping from CfH disaster, focus on high
value/established systems (HL7/13606)
• Nordic: well established, (↑13606 / HL7 v2/CDA)
• EU: very patchy – HL7/↑13606/openEHR
• Asia: patchy - proprietary / HL7 / little 13606/openEHR
• Brazil/Paraguay: mainly openEHR & HL7 v2/CDA
• Australia: Nehta/PCEHR, v2/v3/CDA & openEHR
5. State of the nation
• Core EHR by 2014 – are we getting there?
• National planning, regional implementations
• Shared Care and Primary/Secondary
– Shared care projects: long term
conditions, maternity, well child etc.
• Clinical Data Repository (CDR) as enabler
– GP2GP, Transfer of Care, eMedications
– Medicines reconciliation, specialist CIS
– Others: NZULM, new NHI/HPI
• Good emphasis & support for standards
6. The Principles
1. Align to national strategy: as per national and regional plans
2. Invest in information: use a technology agnostic common
content model, and use standard terminologies
3. Use single content model: information for exchange will be
defined and represented in a single consistent way
4. Align to business needs: prioritise the Reference Architecture
in line with regional and national programmes
5. Work with sector: respect the needs of all stakeholders
6. Use proven standards: adopt suitable and consistent national
and international standards wherever they exist (in preference to
inventing new specifications)
7. Use a services approach: move the sector from a messaging
style of interaction to one based on web services
9. What is ECM?
• IT IS A REFERENCE LIBRARY - for enabling consistency in HIE payload
• Superset of all clinical dataset definitions
– normalised using a standard EHR record organisation (aka DCM)
– Expressed as reusable and computable models – Archetypes
• Top level organisation follows CCR*
• Further detail provided by:
– Existing relevant sources (CCDA, Nehta, epSoS, HL7 FHIR etc.)
– Extensions (of above) and new Archetypes (NZ specific)
• Each HIE payload (CDA) will correspond to a subset (and
conform)
* kind of – CCDA may be more appropriate
12. ECM Working Principle
[Diagram] Each message payload (CDA) conforms to the Exchange Content Model. The source system maps its source data to the ECM, the exchange data object travels via a web service, and the recipient system maps from the ECM to its own data model.
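The working principle can be sketched as two independent mappings that meet at the ECM. All field names and mappings below are invented for illustration; real payloads are CDA documents conforming to the ECM.

```python
# Illustrative sketch of the ECM working principle: the source system
# maps its local fields to the shared Exchange Content Model, and the
# recipient maps from the ECM to its own fields. Field names here are
# invented; real payloads would be CDA documents conforming to the ECM.

SOURCE_TO_ECM = {"drug_name": "medicine", "qty": "dose"}
ECM_TO_RECIPIENT = {"medicine": "medication", "dose": "dose_amount"}

def remap(record, mapping):
    """Rename the keys of a record according to a field mapping."""
    return {mapping[k]: v for k, v in record.items() if k in mapping}

source = {"drug_name": "Metformin", "qty": "500 mg"}
ecm_payload = remap(source, SOURCE_TO_ECM)          # exchange data object
recipient = remap(ecm_payload, ECM_TO_RECIPIENT)    # recipient's view

assert ecm_payload == {"medicine": "Metformin", "dose": "500 mg"}
assert recipient == {"medication": "Metformin", "dose_amount": "500 mg"}
```

The point of this hub-shaped design is that each system maintains a single mapping to the ECM, rather than a pairwise mapping to every other system it exchanges data with.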
13. Authoring & HISO process
• Initiated & funded by Health Sector Architects Group
(SAG), an advisory group to the NHITB
• 4 co-authors – from Interoperability WG
• Initial feedback from SAG then publish on HIVE
• ABB produced - a condensed version of the IRA (2011)
• Public comment and evaluation panel October 2011
• Ballot round February 2012
• Interim standard April 2012
• Trial implementation with Northern DHBs, 2012/13
14. Archetypes
• The way to go for defining clinical content
CIMI (led by S. Huff @ Intermountain & Mayo)
In many nat’l programmes (eg. Sweden, Slovenia, Australia, Brazil)
• Smallest indivisible units of clinical information with clinical context
• Brings together building blocks from Reference Model (eg. record
organisation, data structures, types)
• Puts constraints on them:
– Structural constraints (List, table, tree, clusters)
– What labels can be used
– What data types can be used
– What values are allowed for these data types
– How many times a data item can occur
– Whether a particular data item is mandatory
– Whether a selection is involved from a number of items/values
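The kinds of constraints listed above can be sketched as follows. This is a toy model in the spirit of archetypes (real archetypes are written in ADL against the openEHR Reference Model); every name here is invented for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Toy sketch of the kinds of constraints an archetype expresses.
# Real archetypes are written in ADL against the openEHR Reference
# Model; all names below are invented for illustration only.

@dataclass
class ElementConstraint:
    label: str                             # which label can be used
    allowed_types: Tuple[type, ...]        # which data types are allowed
    allowed_values: Optional[List] = None  # permitted values, if constrained
    min_occurs: int = 0                    # occurrences; min >= 1 means mandatory
    max_occurs: int = 1

    def validate(self, values: List) -> bool:
        if not (self.min_occurs <= len(values) <= self.max_occurs):
            return False
        return all(
            isinstance(v, self.allowed_types)
            and (self.allowed_values is None or v in self.allowed_values)
            for v in values
        )

# A mandatory coded-text element with a small value set
severity = ElementConstraint(
    label="Severity",
    allowed_types=(str,),
    allowed_values=["mild", "moderate", "severe"],
    min_occurs=1, max_occurs=1,
)

assert severity.validate(["moderate"])
assert not severity.validate([])            # mandatory item missing
assert not severity.validate(["extreme"])   # value outside the set
```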
15. Logical building blocks of EHR
EHR
  Folders
    Compositions
      Sections
        Entries
          Clusters
            Elements
              Data values
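The containment hierarchy above can be pictured as nested structures. This is plain illustrative Python, not a real openEHR API, and the blood pressure element is an invented example.

```python
# Illustrative nesting of the building blocks on this slide, as plain
# Python dicts (not a real openEHR API); a blood pressure element is
# used as an invented example data value.
ehr = {"folders": [
    {"compositions": [
        {"sections": [
            {"entries": [
                {"clusters": [
                    {"elements": [
                        {"name": "Systolic",
                         "value": {"magnitude": 120, "units": "mm[Hg]"}}
                    ]}
                ]}
            ]}
        ]}
    ]}
]}

# Walk down the containment hierarchy to reach a data value
element = (ehr["folders"][0]["compositions"][0]["sections"][0]
              ["entries"][0]["clusters"][0]["elements"][0])
assert element["value"]["magnitude"] == 120
```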
17. Extending ECM
• Addition of new concepts
• Making existing concepts more specific
– powerful Archetype specialisation mechanism:
– Lab result > HbA1C result, Lipid profiles etc.
Example: Problem > Diagnosis > Diabetes diagnosis
Problem (base archetype): Text or Coded Term; Clinical description; Date of onset; Date of resolution; No of occurrences; Stage
First level specialisation (Diagnosis): Coded Term; + Grading; + Diagnostic criteria
Second level specialisation (Diabetes diagnosis): Coded Term "Diabetes"; Diagnostic criteria constrained to Fasting > 6.1, GTT 2hr > 11.1, Random > 11.1
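A minimal sketch of the specialisation rule, assuming an invented dict-based representation: each level may only tighten, never loosen, its parent's constraints.

```python
# Sketch of archetype specialisation as progressive narrowing. Names
# and thresholds follow the slide's Problem > Diagnosis > Diabetes
# example; the dict structure itself is invented for illustration.

problem = {
    "description": {"types": ["text", "coded"]},
    "date_of_onset": {"types": ["date"]},
}

# First level: Diagnosis restricts the description to a coded term
# and adds new items (additions only narrow the model further).
diagnosis = dict(problem)
diagnosis["description"] = {"types": ["coded"]}
diagnosis["diagnostic_criteria"] = {"types": ["text"]}

# Second level: Diabetes diagnosis fixes the diagnostic criteria.
diabetes = dict(diagnosis)
diabetes["diagnostic_criteria"] = {
    "types": ["coded"],
    "values": ["Fasting > 6.1", "GTT 2hr > 11.1", "Random > 11.1"],
}

# Anything valid against the child is valid against the parent:
# the child's allowed types are a subset of the parent's.
assert set(diabetes["description"]["types"]) <= set(problem["description"]["types"])
```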
19. Case Study: Medication
• Essential to get it right – first in patient safety!
• Single definition of Medication will be reused in many
places, including:
– ePrescribing
– My List of Medicines
– Transfer of care
– Health (status & event) summary
– Specialist systems
– Public Health / Research
• Currently no standard def in NZ
(coming soon 10043 Connected Care)
• NZMT / NZULM & Formulary > bare essentials
20. Current state & projects
• PMS: each vendor own data model
• GP2GP: great start for structure
• NZePS: started with proprietary model, now waiting
for standard CDA.
– PMS vendors implementing Toolkit based Adapter
• Hospitals: some using CSC MedChart
• Pharmacies?
• Others?
Actually we’re not doing too bad
21. Why bother?
(with a standard structured Medication definition)
“If you think about the seemingly simple concept of
communicating the timing of a medication, it readily
becomes apparent that it is more complex than most
expect…”
“Most systems can cater for recording ‘1 tablet 3 times a
day after meals’, but not many of the rest of the
following examples, ...yet these represent the way
clinicians need to prescribe for patients...”
Dr. Sam Heard
22. Medication timing
Dose frequency | Examples
every time period | …every 4 hours
n times per time period | …three times per day
n per time period | …2 per day; …6 per week
every time period range | …every 4-6 hours; …2-3 times per day
Maximum interval | …not less than every 8 hours
Maximum per time period | …to a maximum of 4 times per day
Acknowledgement: Sam Heard
23. Medication timing cont.
Time specific | Examples
Morning and/or lunch and/or evening | …take after breakfast and lunch
Specific times of day | 06:00, 12:00, 20:00
Dose duration | Examples
Time period | …via a syringe driver over 4 hours
Acknowledgement: Sam Heard
24. Medication timing cont.
Event related | Examples
After/Before event | …after meals; …before lying down; …after each loose stool; …after each nappy change
n time period before/after event | …3 days before travel
Duration n time period before/after event | …on days 5-10 after menstruation begins
Acknowledgement: Sam Heard
25. Medication timing – still cont.
Treatment duration | Examples
Date/time to date/time | 1-7 January 2005
Now and then repeat after n time period/s | …start, repeat in 14 days
n time period/s | …for 5 days
n doses | …Take every 2 hours for 5 doses
Acknowledgement: Sam Heard
26. Medication timing – even more!
Triggers/Outcomes | Examples
If condition is true | …if pulse is greater than 80; …until bleeding stops
Start event | …Start 3 days before travel
Finish event | …Apply daily until day 21 of menstrual cycle
Acknowledgement: Sam Heard
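As a sketch of how such timing patterns could be captured in structured form rather than free text: the field names below are invented, and a real model would come from a Medication archetype.

```python
from dataclasses import dataclass
from typing import Optional

# Sketch of a structured dose-frequency representation covering some
# of the patterns in these tables (exact count, range, and a maximum
# per period). Field names are invented for illustration only.

@dataclass
class DoseFrequency:
    times_min: int                        # n, or lower bound of a range
    times_max: int                        # upper bound; == times_min if exact
    per_period: str                       # e.g. "day", "week", "4 hours"
    max_per_period: Optional[int] = None  # cap, e.g. a maximum of 4 per day

    def __str__(self):
        n = (str(self.times_min) if self.times_min == self.times_max
             else f"{self.times_min}-{self.times_max}")
        s = f"{n} times per {self.per_period}"
        if self.max_per_period is not None:
            s += f", to a maximum of {self.max_per_period} per {self.per_period}"
        return s

tds = DoseFrequency(3, 3, "day")                   # three times per day
prn = DoseFrequency(2, 3, "day", max_per_period=4) # 2-3 per day, max 4

assert str(tds) == "3 times per day"
assert str(prn) == "2-3 times per day, to a maximum of 4 per day"
```

Because each pattern is held as discrete fields rather than free text, a receiving system can compute with it (e.g. schedule administration times) instead of asking a human to re-read the instruction.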
27. Modelling Medication Definition
• NZePS data model (v1.9) & draft 10043
Connected Care CDA templates
• Start from Nehta ePrescribing model
– Analyse models and match data elements
– Extend where necessary as per NZ requirements
• Add new items or rename existing
• Tighter constraints on existing items (e.g.
cardinality, code sets, data types)
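The element-matching step above can be sketched with set operations; the element names below are invented, not the actual NZePS or Nehta fields.

```python
# Hypothetical sketch of the element-matching step: compare a local
# (NZePS-style) model against a reference (Nehta-style) model by
# element name, and report matches, gaps and needed extensions.
# All element names are invented for illustration.
nzeps = {"medicine", "dose", "frequency", "route"}
nehta = {"medicine", "dose", "frequency", "duration"}

matched = nzeps & nehta   # elements aligned across both models
extend  = nzeps - nehta   # NZ-specific items to add as extensions
unused  = nehta - nzeps   # reference items not (yet) needed locally

assert matched == {"medicine", "dose", "frequency"}
assert extend == {"route"}
assert unused == {"duration"}
```

In practice matching is rarely this clean: names differ, granularity differs, and each candidate pair needs clinical review, but the triage into matched/extend/unused is the shape of the exercise.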
31. Results & Outlook
• Extended model 100% covering NZePS
(community ePrescribing)
• Must consider secondary care
• Need to look in more detail:
– Consolidated CDA
– epSoS (European framework)
– Other nat’l programmes
• Generate Payload CDA using transforms
32. Value Proposition
• Content is ‘clinician’s stuff’ – not techy; yet most existing standards are
meaningless for clinicians and vice versa for techies
– Archetypes in ‘clinical’ space – easily understood & authored by them
• Single source of truth for entire sector
– One agreed way of expressing clinical concepts – as opposed to
multiple ways of doing it with HL7 CDA (CCDA is a good first step)
• Archetypes can be transformed into numerous formats – including CDA
• Archetypes are ‘maximal datasets’
– Much easier to agree on
• Scope not limited to HIE but whole EHR; workflow supported
• ECM principle ‘invest in information’ fulfilled completely
– future proof content today for tomorrow’s implementation technology
(e.g. FHIR etc., distributed workflows etc.)
33. Thank you – Questions?
Empowered by openEHR - Clinicians in the Driver’s Seat!
Editor's Notes
These are the three building blocks – or pillars – of the HISO 10040 series that embodies the central ideas of the Reference Architecture for Interoperability. 10040.1 is about regional CDRs and transport. 10040.2 is about a content model for information exchange, shaped by the generic information model provided by CCR, with SNOMED as the default terminology, and openEHR archetypes as the chief means of representation. 10040.3 is about CDA structured documents as the common currency of exchange – not every single transaction type, but the patient information-laden ones.
Published by HISO (2012); part of the Reference Architecture for Interoperability. “To create a uniform model of health information to be reused by different eHealth projects involving HIE.” Consistent, extensible, interoperable and future-proof data.
Content is ‘clinician’s stuff’ – not techy; yet most existing standards are meaningless for clinicians and vice versa for techies. openEHR Archetypes are in ‘clinical’ space – easily understood and authored by them. Archetypes can be transformed into numerous formats – including CDA. Archetypes are ‘maximal datasets’, e.g. they are much more granular than other models when needed. They support more use cases – indeed almost anything to do with the EHR (including some workflow). Scope is not limited to HIE but the whole EHR. One agreed way of expressing clinical concepts – as opposed to multiple ways of doing it with HL7 CDA (CCDA is a good first step though). The ECM principle ‘invest in information’ is fulfilled completely – future-proof content today for tomorrow’s implementation technology (e.g. FHIR etc., distributed workflows etc.)
... And more
Objective of this demo is to show the bottom-up content development approach. Certain Archetypes shared by key HIE projects (eRef, ePrescribing, PREDICT) undergo an iterative localisation process: International > Multiple Local projects (added & extended) > Added to ECM.