Presentation of the talk given by Duccio Cocchi (Researcher, Università degli Studi di Firenze) and Claudio Carpini (Direction Assistant, Azienda Ospedaliero-Universitaria Careggi Firenze), titled "Practical application of simulation models at Careggi university hospital", at the Decision Science Forum 2019, the leading Italian event on Decision Science.
This document describes the development of an electronic workflow system called scope to improve surgical practice at a District Health Board (DHB) hospital. The goals were to seamlessly map the patient journey, accurately collect coded data, and leverage trusted data to inform clinicians. The system streamlines waiting lists, captures accurate operating notes, and facilitates morbidity and mortality meetings. Implementation across surgical specialties has achieved good compliance and uptake. Preliminary results found increased quality of notes, discussion of complications, and potential to change practice through advanced data analysis. In conclusion, scope has replaced a disconnected paper system with a seamless electronic solution that fully captures standardized data to improve surgical outcomes.
The Diabetes Discovery Project at Austin Health aimed to use their Cerner EMR system to routinely test HbA1c levels on inpatients over 54 to identify undiagnosed and poorly controlled diabetes. Testing of over 5,000 patients found 5% had undiagnosed diabetes and 29% had known diabetes. Higher HbA1c levels were associated with increased hospital admissions and longer lengths of stay for surgical patients. The project demonstrated using health IT to identify diabetes management opportunities. Ongoing work includes refining protocols and expanding to other patient populations.
Analytical and post-analytical errors can occur in clinical chemistry laboratories. Analytical errors include issues like test systems not being calibrated properly, controls being out of range but results still reported, improper measurements or reagents, and instrument maintenance issues. This can lead to inaccuracies, imprecisions, insensitivities, and linearity problems. Post-analytical errors involve things like transcription mistakes in reporting results, reports going to the wrong location, illegible reports, or reports not being sent at all. Laboratories should develop systematic workflows, continuously monitor for errors, and strengthen defenses to minimize vulnerabilities and their impacts, which can include inadequate patient care, misdiagnosis, harm, or even death.
Opening Keynote: "From Patient to Population: Providing Optimal Care - The Role for Technology"
Ronald Paulus, MD, MBA
President & CEO
Mission Health System
Dr Nic Woods discusses tools for early recognition and management of sepsis using the electronic medical record (EMR). Sepsis poses a major global health challenge and burden. Tools discussed include a sepsis predictive model built into the EMR that can detect signs of sepsis with sensitivities of 68-91% and specificities of 91-97.6%. Clinical decision support and workflows in the EMR are also used to alert clinicians and guide treatment. Evaluations found these tools helped reduce mortality from sepsis by 4.2-17% and lower length of hospital stays. Key points emphasized that predictive models integrated into clinical workflows can positively impact outcomes, but more progress is still needed.
PCMH: Part 4 – Learn How to Start or Improve Your Quality Improvement Program - Julie Champagne
We wrap up our PCMH series with a deep dive into Standard 5 (Care Coordination and Care Transitions) and Standard 6 (Performance Measurement and Quality Improvement). How are you handling referrals and transitions of care today? Do you need to make changes to optimize the process? We’ll review the care coordination elements and factors, as well as the performance improvement standards, elements, and associated factors, in this webinar to complete your practice’s PCMH transformation!
This document outlines a study conducted at Magee Rehabilitation's Gaspar Clinic from July to August 2013. It collected data on patient wait times and processes to establish benchmarks. It then used simulation to model potential workflow improvements by eliminating wait times, in order to increase efficiency and satisfaction while reducing costs. The benefits of simulation discussed include establishing norms, improving workflows, eliminating waste, increasing revenue and decreasing costs through increased volume and productivity. Results recorded in the simulation included patient and state histories, activity details, and wait times.
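The kind of wait-time simulation described above can be sketched as a simple single-server discrete-event model. The clinic parameters below (arrival gap, service time) are hypothetical placeholders, not figures from the Gaspar Clinic study:

```python
import random

def simulate_clinic(n_patients, arrival_gap_mean, service_mean, seed=1):
    """Single-server clinic with exponential interarrival and service times;
    returns each patient's wait (minutes) before being seen."""
    random.seed(seed)
    clock, server_free_at, waits = 0.0, 0.0, []
    for _ in range(n_patients):
        clock += random.expovariate(1 / arrival_gap_mean)  # next arrival time
        start = max(clock, server_free_at)                 # wait if server busy
        waits.append(start - clock)
        server_free_at = start + random.expovariate(1 / service_mean)
    return waits

waits = simulate_clinic(n_patients=200, arrival_gap_mean=12, service_mean=10)
print(f"mean wait {sum(waits) / len(waits):.1f} min, max wait {max(waits):.1f} min")
```

Rerunning the model with a shorter service time (e.g. after a workflow change) shows how simulation can estimate the effect of an improvement before implementing it.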
This document discusses laboratory errors, their causes, and ways to prevent them. It notes that errors can occur at any stage of the testing process, from specimen collection to reporting of results; the majority are pre-analytical or post-analytical. Common causes include inadequate staffing, poor quality control, time pressures, and lack of test validation. Errors are classified as latent (due to organizational failures or infrastructure issues) or active (pre-analytical, analytical, and post-analytical errors). Preventing errors requires measures such as staff education, adherence to standards and procedures, quality monitoring, and effective communication across departments. Reducing errors is important for accurate diagnosis and treatment of patients.
Pathology Optimisation in Chronic Blood Disease Monitoring - Andrew O'Hara
Richard Croker shows how an innovative approach to service redesign can improve patient outcomes at pace and scale through the safe and effective use of testing at NHS Northern, Eastern and Western Devon CCG.
The document summarizes research on how nurses spend their time and what prevents them from being at the bedside. It found that nurses spend significant time on communication, walking, and documentation rather than direct patient care activities. Common obstacles preventing nurses from their work included noisy environments, distractions, lack of supplies and long wait times. The document then introduces Wallaroo with Isonas as an access control solution that could help by reducing walking time, streamlining medication administration and documentation to allow nurses to spend more time with patients. It describes Isonas technology as providing networked access control that could help optimize nurses' workflows and time spent on critical tasks if implemented.
This document discusses laboratory errors, their causes, types, and impacts. It describes how errors can occur in the pre-analytical, analytical, and post-analytical phases of testing and provides examples of errors in each phase. Errors are categorized as either determinate (systematic) errors, which are reproducible and can be identified and corrected, or indeterminate (random) errors, which are caused by uncontrollable variables and cannot be eliminated. The key goals are improving precision by reducing indeterminate errors and improving accuracy by reducing determinate errors.
The document discusses various methods for measuring employee performance and conducting performance reviews. It describes establishing objectives and standards for measuring performance through task analysis and quality control. Actual performance is then measured and compared to the standards. If a mismatch is found, the head nurse is responsible for addressing it. The document also outlines structuring performance reviews, giving feedback, and setting goals.
What do clinicians need to know about lab tests? - Ola Elgaddar
A presentation in the Annual meeting of the Egyptian American Scholars (AEAS) in Cairo 2015.
I am trying here to describe, in short, from my point of view as a laboratorian, the points that we need to discuss with clinicians. Both groups should share some terms and definitions and should see things from the same perspective!
Here are the steps to calculate sensitivity, specificity, positive predictive value, negative predictive value, and efficiency for the given diagnostic test:
* True Positives (TP) = Number of HIV+ samples correctly identified as positive by the test = 120 - 15 = 105
* True Negatives (TN) = Number of HIV- samples correctly identified as negative by the test = 300 - (120 + 4) = 176
* False Positives (FP) = Number of HIV- samples incorrectly identified as positive by the test = 4
* False Negatives (FN) = Number of HIV+ samples incorrectly identified as negative by the test = 15
Sensitivity = TP / (TP + FN) = 105 / (105 + 15) = 105 / 120 = 87.5%
Troubleshooting QC Problems: Your QC has failed, what do you do next? - Randox
Randox Quality Control's next 'Improving Laboratory Performance Through Quality Control' educational guide has been published, with helpful tips your laboratory can use to ensure it has effective troubleshooting procedures in place.
So you ran QC this morning and realised that one of your analytes has been flagged as 'out-of-control'. What do you do next? Do you ignore the warning and continue patient testing, repeat the control until it's within range, or halt patient testing and investigate the source of the error?
When it comes to troubleshooting QC errors, unfortunately there is no easy path to take. However, it's important to have standard operating procedures in place outlining what to do in the event of an out-of-control error. Errors occur in laboratories all over the world. A lab with effective troubleshooting procedures will still have errors, but it will be able to detect them quickly, reducing their impact and the risk of wasting both time and money.
Quality control and quality assurance are important for ensuring accurate lab results. Quality control involves regularly testing controls of known values to monitor a test's performance. Key quality control statistics include the mean and standard deviation, which are used to calculate control limits on a Levey-Jennings chart. Westgard rules provide standards for determining when a test is out of control based on control values. Quality assurance encompasses the overall programs and procedures that a lab follows to ensure accurate and reliable results. It has strategic, tactical, and operational levels.
In the continuous quality journey, controlling laboratory errors is an integral part, and focusing on the analytical and post-analytical processes is the first step. Developing a reporting culture, followed by thorough analysis and implementation of appropriate corrective and preventive actions, is required.
Critical Care Research: Connection to Practice - Allina Health
1) The document discusses a critical care research program at Abbott Northwestern Hospital with the goals of conducting studies to improve patient outcomes, enhance quality of care, and reduce costs.
2) The program involves intensivists, hospitalists, and other clinical specialties conducting studies and presenting findings to improve practice.
3) Several ongoing studies are summarized that examine issues like postoperative monitoring, pulmonary ultrasound scoring, infection risks, and outcomes after procedures.
This document provides an overview of quality control in clinical biochemistry laboratories. It discusses that quality control aims to ensure test results are correct by minimizing errors. Errors can occur in the pre-analytical, analytical, and post-analytical phases. The pre-analytical phase, involving sample collection and handling, accounts for most errors. Laboratories use internal quality control methods like calibration, controls, and Levey-Jennings charts daily, as well as external quality assurance programs, to monitor performance and identify errors. Maintaining quality control is important for generating accurate, reliable test results.
This document outlines an agenda and case studies for a healthcare analytics bootcamp. The bootcamp will use healthcare data to develop machine learning solutions to predict heart disease and identify high-risk patients. Case Study 1 will involve exploratory data analysis of tuberculosis data to analyze global trends, hotspots, and mortality rates. Case Study 2 will use a heart disease screening dataset and logistic regression to build a model to predict heart disease risk and develop treatment plans for high-risk patients. The document discusses the types of structured and unstructured healthcare data, sources of data, and applications of machine learning in healthcare analytics.
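A logistic-regression risk model of the kind Case Study 2 describes can be sketched with plain gradient ascent. The features and data below are synthetic stand-ins, not the bootcamp's actual screening dataset:

```python
import numpy as np

# Synthetic stand-in for a heart-disease screening dataset:
# three standardized features (think age, cholesterol, blood pressure).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, 1.0, -0.5])
y = (1 / (1 + np.exp(-(X @ true_w))) > rng.random(200)).astype(float)

# Fit logistic regression by gradient ascent on the log-likelihood.
w = np.zeros(3)
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w)))      # predicted probability of disease
    w += 0.1 * X.T @ (y - p) / len(y)   # gradient step

risk = 1 / (1 + np.exp(-(X @ w)))
print("learned weights:", np.round(w, 2))
print("patients flagged high-risk (p > 0.7):", int((risk > 0.7).sum()))
```

The high-risk flag at the end is the step that would feed into targeted treatment plans; the 0.7 threshold is an arbitrary example cut-off.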
This document discusses quality control, quality assurance, and quality assessment in medical laboratories. It defines key terms like quality control, quality assurance, and quality assessment. Quality control refers to analytical measurements used to assess data quality, while quality assurance is an overall management plan to ensure data integrity. Quality assessment determines the quality of results generated by evaluating internal and external quality programs. The document outlines quality assurance and quality control processes like standard operating procedures, equipment and reagent validation, personnel competency, and documentation. It also discusses error types, control chart interpretation, and Westgard rules for evaluating quality control results.
Understand what healthcare analytics is.
Identify the 5-stage Analytics Program Lifecycle (APL).
Understand how data analytics can be used in healthcare.
Check it on Experfy: https://www.experfy.com/training/courses/introduction-to-healthcare-analytics.
The document summarizes the results of a survey of Clinical Skills Trainers (CSTs) in London that was conducted to review service delivery models and identify areas for improvement. Key findings included:
- Job descriptions and sessional commitments varied between CSTs, with most having no formal job plan.
- Frequency of image review and feedback to screeners ranged from monthly to ad hoc.
- Training and equipment checking practices among CSTs were also inconsistent.
- Monitoring of CST workload was done informally by most respondents.
1) The webinar discusses the use of statistical process control (SPC) in healthcare delivery system research and quality improvement. SPC uses control charts to monitor processes and outcomes over time and distinguish normal variation from meaningful changes.
2) An example is provided of how SPC was used to monitor disease symptoms electronically during the 2002 Salt Lake City Olympics to detect bioterrorism attacks. Near real-time SPC of clinic data allowed quick identification of a significant increase in gastrointestinal issues from influenza.
3) SPC can also be applied at the patient, provider, and population levels. An example pilot study used SPC to monitor the effects of a clinical workflow redesign emphasizing the medical assistant's role.
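The syndromic-surveillance use of SPC described in point 2 can be sketched as a c-chart over daily symptom counts. The counts below are hypothetical, not data from the Salt Lake City system:

```python
import math

def c_chart_limits(counts):
    """3-sigma limits for a c-chart (counts per fixed interval): c-bar ± 3·sqrt(c-bar)."""
    cbar = sum(counts) / len(counts)
    lcl = max(0.0, cbar - 3 * math.sqrt(cbar))  # lower limit floored at zero
    ucl = cbar + 3 * math.sqrt(cbar)
    return lcl, ucl

# Hypothetical daily counts of GI complaints; the final day spikes.
baseline = [4, 6, 5, 3, 7, 5, 4, 6, 5, 5]
lcl, ucl = c_chart_limits(baseline)
for day, c in enumerate(baseline + [15], start=1):
    if c > ucl:
        print(f"day {day}: count {c} exceeds UCL {ucl:.1f} - investigate")
```

Points inside the limits are treated as normal (common-cause) variation; a point beyond the upper limit is the "meaningful change" signal that triggers investigation.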
The document discusses using data and analytics to drive improvements in healthcare. It outlines the components of a data-driven organization, including an enterprise data warehouse, metrics, predictive models, protocols, and governance. It also discusses how analytics can help healthcare providers transition to value-based payments by measuring quality, reducing variation, and eliminating waste. Specific examples are provided on how one healthcare system used data to reduce variation in spine care, lower bleeding complications after PCI procedures, identify drug cost opportunities in knee replacements, and lower supply costs for lumbar fusion procedures.
This document outlines a real-world knowledge translation approach used in Alberta, Canada to facilitate evidence-informed decision making about robot-assisted surgery (RAS). It describes establishing committees to guide the re-evaluation of RAS, identifying current RAS procedures and gaps in evidence, developing strategies for data collection and a training/credentialing process, commissioning an economic analysis, and engaging patients. The overall goal is to ensure RAS technologies are implemented responsibly based on accurate local data and with oversight of costs, outcomes, and impacts on the health system and population health needs.
IV Congresso Internacional CBA2017
Emerging Technologies and the Quality of Care
David W. Bates, MD, MSc, Chief, Division of General Internal Medicine, Brigham and Women’s Hospital, Past President, ISQua
The document summarizes lean process improvements implemented in a histopathology department. Broad aims were to share lean solutions, debate strengths/weaknesses, and network. The presentation describes implementing standardized work templates to reduce movement and errors, optimizing layout to reduce travel distance, and introducing a pull system for consultant reporting to better match workload to availability. Measurable outcomes included a 43% reduction in specimen checks, 92% of work reported within 7 days (up from 50%), and decreased consultant reporting time from 4.5 to 1.8 days.
The document summarizes lean process improvements implemented in a histopathology department. Broad aims included sharing lean solutions, debating strengths/weaknesses, and networking. Specific changes included:
1) Mapping processes to identify waste like movement and waiting, then standardizing work like using templates to reduce movement and errors.
2) Optimizing the laboratory layout to reduce distance traveled and support continuous specimen flow.
3) Introducing a "pull" system where consultants pull batches based on capacity rather than being "pushed" work, improving turnaround times.
These changes reduced non-value added time and waste, supporting the goals of faster reporting and improved patient care.
The document summarizes lean process improvements implemented in a histopathology department. Broad aims included sharing lean solutions, debating strengths/weaknesses, and networking. Specific changes included:
1) Mapping processes to identify waste like movement and waiting, then standardizing work like using templates to reduce movement and errors.
2) Optimizing the laboratory layout to reduce distance traveled and support continuous specimen flow.
3) Introducing a "pull" system where consultants pull batches based on capacity rather than being "pushed" work, improving turnaround times.
These changes reduced non-value added time and waste, supporting the goals of faster reporting and improved patient care.
Emergency Department Throughput: Using DES as an effective tool for decision ...SIMUL8 Corporation
This document discusses using discrete event simulation (DES) to support decision making in emergency departments. DES allows modeling of dynamic patient flow and testing of "what if" scenarios. The document outlines best practices for setting up successful DES projects including defining objectives, gathering quality data, validating models, and including frontline staff. Case studies demonstrate how DES has been used at hospitals to evaluate options for capacity changes, process improvements, and reducing wait times.
How to improve patient flow in emergency and ambulatory care, pop up uni, 10a...NHS England
Expo is the most significant annual health and social care event in the calendar, uniting more NHS and care leaders, commissioners, clinicians, voluntary sector partners, innovators and media than any other health and care event.
Expo 15 returned to Manchester and was hosted once again by NHS England. Around 5000 people a day from health and care, the voluntary sector, local government, and industry joined together at Manchester Central Convention Centre for two packed days of speakers, workshops, exhibitions and professional development.
This year, Expo was more relevant and engaging than ever before, happening within the first 100 days of the new Government, and almost 12 months after the publication of the NHS Five Year Forward View. It was also a great opportunity to check on and learn from the progress of Greater Manchester as the area prepares to take over a £6 billion devolved health and social care budget, pledging to integrate hospital, community, primary and social care and vastly improve health and well-being.
More information is available online: www.expo.nhs.uk
Point of Care Testing for Enhancing Patient Centered Planned Care DeliveryPAFP
PAFP 2013 Regional Lecture Series
Session 1 - Northeast
Presenter: Linda Thomas-Hemak, MD
The Wright Center for Primary Care
Broadcast live through the PAFP Community.
October 2nd, 2013 12pm - 1pm
Part 3: Rare Disease Clinical Development – Strategies for Ensuring Endpoint ...Medpace
n this free webinar, Medpace partners with Michelle Eagle of ATOM International, a provider of CE training for clinical trials across the world, to discuss approaches and steps that can be taken to ensure quality and integrity.
This document outlines a problem with emergency department throughput times exceeding targets of 175 minutes on average. It proposes implementing a rapid improvement event to improve decision to discharge times for admitted and discharged patients, placing a mid-level provider in triage, improving diagnostic window times, and reducing patients who leave without being seen or against medical advice to help streamline the patient care process from arrival to departure. The goals are to have a positive impact on patients, staff, and the organization.
The document discusses the trend toward automating phase I clinical trials. Historically, phase I trials have been largely paper-based, which has led to challenges around volunteer recruitment, complex workflows, scheduling studies, ensuring sample integrity, and accurately capturing data. However, there is a growing trend toward automating phase I trials using solutions like Oracle's LabPas in order to address these challenges, improve efficiency, and help phase I clinics and sponsors better manage costs, quality, and speed of data delivery. The document outlines key benefits and features of LabPas for automating phase I trials.
This document discusses process redesign in healthcare settings through the use of health information technology. It begins by setting learning objectives around proposing process redesign strategies in healthcare to improve patient safety and efficiency. It then provides an overview of common software functions like practice management systems, laboratory information systems, imaging systems, and patient portals. Examples of workflows between these systems and electronic health records are described. The document concludes by presenting a scenario of a clinic seeking to implement an electronic health record and redesign its processes, asking questions about recommendations.
Scheduling - Elaine Kemp National Improvement Lead
NHSIQ Domain 3
Presentation from the Productive Endoscopy Workshop, Tuesday 15th October 2013 at Ambassadors Bloomsbury , London, WC1H 0HX
This meeting brought together teams from around the country, and embarked on creating and testing the productive endoscopy toolkit. The aim of the day is to allow time with your team for sharing of experiences and exchange of good practice, learn how to apply lean techniques and hear the impact of successfully implemented case studies.
The document discusses a study conducted on the implementation of a Hospital Information System (HIS) at the National Heart Institute in India. The study aimed to analyze changes in outpatient department workflows, perform data mapping between existing and new electronic systems, and enrich the electronic masters. Key findings included improved patient registration times and billing processes as well as faster information flow across departments after HIS implementation. The document also describes various components of developing and implementing the computerized HIS including system analysis, design, testing and evaluation.
Educational presentation for medical laboratory technologists on how to create a lean culture in their workplace to improve the healthcare service by minimizing waste and enhancing work effeciency. An example in this presentation is about minimizing patient's wait time in the laboratory reception area.
Speaker Presentation from U.S. News Healthcare of Tomorrow leadership summit, Nov. 1-3, 2017 in Washington, DC. Find out more about this forum at www.usnewshot.com.
Health Care: Cost Reductions through Data Insights - The Data Analysis GroupJames Karis
An overview of the cost reduction opportunities for a Health Care provider. These opportunities can be identified, quantified and optimised through data-driven insights. The slide pack also provides a strategic overview of how one would set up such a project within a large organisation, whilst mitigating patient-care concerns.
This document discusses effective strategies for collecting data for clinical audits. It explains that the aim of data collection is to measure performance against set standards. Key factors to consider when planning data collection include determining what data to collect, where to find it, who will collect it, and how it will be collected. Data can be collected prospectively or retrospectively depending on the topic of the audit. The document provides examples of data collection methods like reviewing patient records, questionnaires, and observation. It also discusses designing data collection forms, determining sample size, sampling methods, piloting the data collection process, analyzing both quantitative and qualitative data, and discussing the results.
4 practical ways eh rs use real time analysis to help providers and patientsCureMD
This document discusses how electronic health records (EHRs) can use real-time analysis to benefit providers and patients. It outlines three main applications: 1) clinical decision support that alerts doctors to potential adverse events or high-risk patients; 2) improvements to clinical workflow through automated tasks and real-time financial reporting; and 3) coding support through claims scrubbing to identify errors before submission. Overall, real-time analysis in EHRs allows providers to access up-to-date patient data and make more informed care decisions.
20131212 salford royal experience an epr 10 years on, implementing ep rs at...amirhannan
Madeleine Neve, IM & T lead at Salford Royal Hospital presents at Health 2.0 Manchester meeting. See http://www.htmc.co.uk/pages/pv.asp?p=htmc0519 to watch talk
9 Quality Management System_EAT G H 2021.pptxNagaraju94925
The facility conducts various quality assurance activities including internal assessments, medical audits, death audits, and prescription audits on a periodic basis. Key processes are mapped to identify non-value adding activities and areas for improvement. Feedback from patients and employees is collected through satisfaction surveys and analyzed, with action plans developed to address low scoring areas. Quality management is supported by documentation like a quality manual, standard operating procedures, and maintenance of documents and records. Continuous improvement is pursued through the plan-do-check-act cycle based on results of assessments, audits and analyzing inputs from stakeholders.
Similar to Practical application of simulation models at Careggi university hospital (20)
Presentazione dello speech tenuto da Manolo Garabini (Engineer of the E. Piaggio Center of the University of Pisa - QBRobotics srl) dal titolo "WRAPP-up: an autonomous dual-arm robot for logistics", durante il Decision Science Forum 2019, il più importante evento italiano sulla Scienza delle Decisioni.
Presentazione dello speech tenuto da Carmine Spagnuolo (Postdoctoral Research Fellow - Università degli Studi di Salerno/ ACT OR) dal titolo "Technology insights: Decision Science Platform", durante il Decision Science Forum 2019, il più importante evento italiano sulla Scienza delle Decisioni.
The experience of a multidisciplinary team in the early diagnosis of Alzheime...Decision Science Community
The document discusses the Modiag project which uses artificial intelligence and machine learning techniques to help with the early and accurate diagnosis of neurodegenerative diseases like Alzheimer's and Parkinson's. A multidisciplinary team is developing a web platform and integrated database to collect and analyze various data types from multiple sources using ML approaches. Preliminary results show ML can successfully classify patients using genomic data and identify important genes and pathways. Further work is needed to better organize and analyze other data types to improve diagnostic precision. The goal is to develop a more efficient diagnostic workflow to support precision medicine approaches for these complex diseases.
Punti ristoro: layout, analisi dei flussi e simulazione per una Customer Expe...Decision Science Community
Presentazione dello speech tenuto da Andrea Cartoccio
(Group Marketing & Innovation Director Italy - Elior Group) dal titolo "Punti ristoro: layout, analisi dei flussi e simulazione per una Customer Experience ottimizzata", durante il Decision Science Forum 2019, il più importante evento italiano sulla Scienza delle Decisioni.
Presentazione dello speech tenuto da Paola Caburlotto
(HR Strategy Advisor) dal titolo "New opportunities & challenges ", durante il Decision Science Forum 2019, il più importante evento italiano sulla Scienza delle Decisioni.
Presentazione dello speech tenuto da Carlo Filippi (Associate Professor of Operations Research - Università degli Studi di Brescia) dal titolo "Optimization of omni-channel distribution in the fashion industry", durante il Decision Science Forum 2019, il più importante evento italiano sulla Scienza delle Decisioni.
Presentazione dello speech tenuto da Claudia Beldon
(VP - Fashion & Luxury Industry at ACT Operations Research) dal titolo "Fashion and Luxury - From sell through to risk-based management ", durante il Decision Science Forum 2019, il più importante evento italiano sulla Scienza delle Decisioni.
Presentazione dello speech tenuto da Maurizio Catellani (VP Brand, Marketing and Communication Strategy - ACT Operations Research) dal titolo "Promotion, marketing campaigns and consumer behaviour : innovative approaches", durante il Decision Science Forum 2019, il più importante evento italiano sulla Scienza delle Decisioni.
Presentazione dello speech tenuto da Claudio Cantarelli (Logistics Director - Moncler) dal titolo "Warehouse Control Tower and Xray Image Recognition", durante il Decision Science Forum 2019, il più importante evento italiano sulla Scienza delle Decisioni.
This document discusses optimizing assistance for passengers with reduced mobility (PRM) at Rome Fiumicino Airport. It provides an overview of the airport's PRM processes, which involve over 300 operators handling more than 100 passenger flows. In 2018, over 50% of flights required at least one PRM assistance during peak periods. The document proposes using predictive analytics, simulation, and optimization to improve forecasting of PRM demand, evaluate impacts of scenarios on processes, and determine optimal resource allocations and schedules. The proposed solution would provide automated forecasting, scenario analysis, and decision support to help enhance assistance services for PRM passengers at the airport.
DSS can help management by simplifying data analysis so they can spend more time thinking strategically. Management should focus on understanding business needs rather than technical details. DSS also challenges beliefs by providing new insights from simulations. The presenter's organization took several approaches to using DSS, starting with optimizing delivery logistics and moving to softer processes. Some case studies showed DSS optimizing transport saturation, reducing defects in product displays, and investigating new picking methods. Key lessons are that DSS are a mindset, not just technology, and change management is critical to success. DSS can help enlarge strategic thinking while also improving operations.
OVS has grown from its first store opening in 1972 through brand transformation and acquisitions. It is now Italy's largest apparel retailer, with a centralized distribution center and supply chain platform serving over 1,300 stores in Italy and internationally. OVS is focusing on digital expansion and using data optimization and AI to improve forecasting, inventory allocation, and logistics to enhance customer service levels. The company's "One Stock" system aims to maximize sales and margins by predicting demand and optimizing inventory transfers between distribution centers, e-commerce, and stores. OVS sees continuing to improve forecasting accuracy and powering systems with external data as keys to further success.
Presentazione dello speech tenuto da Emanuele Carpanzano (Director of the Department of Innovative Technologies at the University of Applied Sciences and Arts of Southern Switzerland) dal titolo "Decision Science: a complex hybrid science", durante il Decision Science Forum 2019, il più importante evento italiano sulla Scienza delle Decisioni.
Analysis insight about a Flyball dog competition team's performanceroli9797
Insight of my analysis about a Flyball dog competition team's last year performance. Find more: https://github.com/rolandnagy-ds/flyball_race_analysis/tree/main
The Building Blocks of QuestDB, a Time Series Databasejavier ramirez
Talk Delivered at Valencia Codes Meetup 2024-06.
Traditionally, databases have treated timestamps just as another data type. However, when performing real-time analytics, timestamps should be first class citizens and we need rich time semantics to get the most out of our data. We also need to deal with ever growing datasets while keeping performant, which is as fun as it sounds.
It is no wonder time-series databases are now more popular than ever before. Join me in this session to learn about the internal architecture and building blocks of QuestDB, an open source time-series database designed for speed. We will also review a history of some of the changes we have gone over the past two years to deal with late and unordered data, non-blocking writes, read-replicas, or faster batch ingestion.
Enhanced Enterprise Intelligence with your personal AI Data Copilot.pdfGetInData
Recently we have observed the rise of open-source Large Language Models (LLMs) that are community-driven or developed by the AI market leaders, such as Meta (Llama3), Databricks (DBRX) and Snowflake (Arctic). On the other hand, there is a growth in interest in specialized, carefully fine-tuned yet relatively small models that can efficiently assist programmers in day-to-day tasks. Finally, Retrieval-Augmented Generation (RAG) architectures have gained a lot of traction as the preferred approach for LLMs context and prompt augmentation for building conversational SQL data copilots, code copilots and chatbots.
In this presentation, we will show how we built upon these three concepts a robust Data Copilot that can help to democratize access to company data assets and boost performance of everyone working with data platforms.
Why do we need yet another (open-source ) Copilot?
How can we build one?
Architecture and evaluation
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
Discussion on Vector Databases, Unstructured Data and AI
https://www.meetup.com/unstructured-data-meetup-new-york/
This meetup is for people working in unstructured data. Speakers will come present about related topics such as vector databases, LLMs, and managing data at scale. The intended audience of this group includes roles like machine learning engineers, data scientists, data engineers, software engineers, and PMs.This meetup was formerly Milvus Meetup, and is sponsored by Zilliz maintainers of Milvus.
Learn SQL from basic queries to Advance queriesmanishkhaire30
Dive into the world of data analysis with our comprehensive guide on mastering SQL! This presentation offers a practical approach to learning SQL, focusing on real-world applications and hands-on practice. Whether you're a beginner or looking to sharpen your skills, this guide provides the tools you need to extract, analyze, and interpret data effectively.
Key Highlights:
Foundations of SQL: Understand the basics of SQL, including data retrieval, filtering, and aggregation.
Advanced Queries: Learn to craft complex queries to uncover deep insights from your data.
Data Trends and Patterns: Discover how to identify and interpret trends and patterns in your datasets.
Practical Examples: Follow step-by-step examples to apply SQL techniques in real-world scenarios.
Actionable Insights: Gain the skills to derive actionable insights that drive informed decision-making.
Join us on this journey to enhance your data analysis capabilities and unlock the full potential of SQL. Perfect for data enthusiasts, analysts, and anyone eager to harness the power of data!
#DataAnalysis #SQL #LearningSQL #DataInsights #DataScience #Analytics
Predictably Improve Your B2B Tech Company's Performance by Leveraging DataKiwi Creative
Harness the power of AI-backed reports, benchmarking and data analysis to predict trends and detect anomalies in your marketing efforts.
Peter Caputa, CEO at Databox, reveals how you can discover the strategies and tools to increase your growth rate (and margins!).
From metrics to track to data habits to pick up, enhance your reporting for powerful insights to improve your B2B tech company's marketing.
- - -
This is the webinar recording from the June 2024 HubSpot User Group (HUG) for B2B Technology USA.
Watch the video recording at https://youtu.be/5vjwGfPN9lw
Sign up for future HUG events at https://events.hubspot.com/b2b-technology-usa/
Natural Language Processing (NLP), RAG and its applications .pptxfkyes25
1. In the realm of Natural Language Processing (NLP), knowledge-intensive tasks such as question answering, fact verification, and open-domain dialogue generation require the integration of vast and up-to-date information. Traditional neural models, though powerful, struggle with encoding all necessary knowledge within their parameters, leading to limitations in generalization and scalability. The paper "Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks" introduces RAG (Retrieval-Augmented Generation), a novel framework that synergizes retrieval mechanisms with generative models, enhancing performance by dynamically incorporating external knowledge during inference.
The Ipsos - AI - Monitor 2024 Report.pdfSocial Samosa
According to Ipsos AI Monitor's 2024 report, 65% Indians said that products and services using AI have profoundly changed their daily life in the past 3-5 years.
2. WHY SIMULATION?
• Reliable
• Saves time and resources
• Allows multiple scenarios to be tested so the best one can be chosen
• Scientific literature reports that discrete event simulation is reliable even in healthcare environments
• Used to analyze process modifications without risk and at low cost
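As a minimal illustration of the technique (the Careggi model itself was built in Arena, and all numbers here are made up), a single-desk discrete event simulation fits in a few lines of Python:

```python
import random

def simulate(n_patients=1000, mean_interarrival=2.0, mean_service=1.5, seed=42):
    """Minimal single-desk queue simulation (times in minutes).

    Interarrival and service times are exponential; returns the
    average waiting time over n_patients.
    """
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)

    desk_free_at = 0.0
    total_wait = 0.0
    for arrival in arrivals:
        start = max(arrival, desk_free_at)   # queue if the desk is busy
        total_wait += start - arrival
        desk_free_at = start + rng.expovariate(1.0 / mean_service)
    return total_wait / n_patients
```

Changing `mean_service` or the arrival rate and re-running is exactly the kind of low-cost, risk-free "what if" experiment the slides refer to.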
3. WHY SIMULATION?
• Applications have focused primarily on the management and optimization of emergency departments and operating rooms
• Logistic and bureaucratic services are also of primary importance for the correct functioning of the hospital
• The analysis of processes of this type is widely treated in the literature, although rarely in healthcare
• The management of a front office service has common characteristics, regardless of place and specific function
4. FRONT OFFICE PROBLEMS
• Increasing workload
• Added reception of the diabetology clinics
• Increasing waiting times
• Decreasing service quality
5. WORKLOAD
• Database data from 1/2/2016 to 24/11/2016
• 126,850 accesses, approximately 600 per day
• 6 types of service, each with a different service time and daily trend of accesses
7. WORKLOAD INCREASE
• Management decided to transfer the reception of the diabetology clinics
• These patients have priority over others
• Waiting time increased significantly due to this additional workload
8. DATA ANALYSIS
• Incoming flow
• Daily and weekly trend of accesses
• Number of front office desks
• Service time
11. DATA ANALYSIS
• The number of accesses increased by over 30% in the fourth quarter of 2017 compared to the data extracted from the database
• There are approximately 500 more weekly accesses caused by diabetology
• Waiting time increased from 7 minutes 20 seconds to 10 minutes 37 seconds
13. PROCESS ANALYSIS
• Understand how requests are handled
• Precise analysis of each step of the process
• Are there services with higher priority than others?
14. PROCESS ANALYSIS: SERVICES PROVIDED
• Delivery of laboratory exams
• Delivery of diagnostic reports
• Bookings
• Request of medical reports
• Delivery of medical reports
• Urgencies
15. PROCESS ANALYSIS
• Payment is needed in some cases
• Patients can pay before or during the service
• Lack of reliable data related to payments
• We cannot determine the precise number of payments or how long they take
16. PROCESS ANALYSIS
There are 2 services with a higher priority than the others:
• Urgencies have priority over all
• Diabetology has priority over all other services except urgencies
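A two-level priority rule like this one can be sketched as a heap keyed on (priority class, arrival order). This is a generic illustration of the rule, not the Arena implementation:

```python
import heapq
import itertools

# Priority classes: lower number is served first.
# Every service not listed here falls into class 2.
PRIORITY = {"urgency": 0, "diabetology": 1}

class FrontOfficeQueue:
    """Urgencies beat diabetology, diabetology beats everything else;
    ties are broken by arrival order (FIFO within a class)."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # preserves FIFO order in a class

    def arrive(self, patient, service):
        rank = PRIORITY.get(service, 2)
        heapq.heappush(self._heap, (rank, next(self._counter), patient))

    def next_patient(self):
        return heapq.heappop(self._heap)[2]
```

For example, if a booking, a diabetology patient, and an urgency arrive in that order, the urgency is served first, then the diabetology patient, then the booking.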
18. DEVELOPMENT OF THE SIMULATION MODEL
• The simulation model was developed based on information gathered through process and data analysis
• The model had to be modified several times to correctly simulate the process
• Some features of the process were not immediately identified
19. DEVELOPMENT OF THE SIMULATION MODEL
• The arrival of patients is managed with schedules based on the historical trend of accesses
• Their paths are determined with attributes based on the analysis of historical data
[Arena flowchart: arrivals for each service (delivery of medical records, delivery of diagnostic and laboratory reports, request of medical records, urgencies, bookings, diabetology, new flow, other users) are assigned attributes and routed through stations to their path]
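Schedule-driven arrivals of the kind described on slide 19 can be sketched as a piecewise-constant Poisson process: within each opening hour, arrivals come with exponential gaps at that hour's historical rate. The hourly means below are illustrative placeholders, not the Careggi access data:

```python
import random

# Hypothetical average accesses per opening hour, as would be
# extracted from the historical database (illustrative values).
HOURLY_MEANS = [110, 95, 80, 60, 45, 30]

def daily_arrival_times(hourly_means, seed=None):
    """Return one simulated day of arrival times, in minutes from opening.

    Each hour uses its own arrival rate (mean arrivals/hour), so the
    generated flow follows the historical daily trend.
    """
    rng = random.Random(seed)
    times = []
    for hour, mean in enumerate(hourly_means):
        t = hour * 60.0
        end = t + 60.0
        while True:
            t += rng.expovariate(mean / 60.0)  # rate in arrivals per minute
            if t >= end:
                break
            times.append(t)
    return times
```

Over many replications the number of generated arrivals per hour averages out to the historical means, which is the behaviour an Arena arrival schedule provides.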
20. DEVELOPMENT OF SIMULATION MODEL
[Arena flowchart: payment path, with decision blocks for pre-service payment, payment at the counter, balance, service type, duplicated entities, and end-of-procedure stations]
• The model manages three routes for users: counters, payment, and the information office
• The service time at the counters is based on direct observations
• The information office has not been tested due to lack of data
21. DEVELOPMENT OF THE SIMULATION MODEL
• Shared resources and sets of resources simulate the front office
• A duplicate entity enters the payment path while the original keeps the resource occupied
• The model records the number of people in queue and the waiting time
22. DEVELOPMENT OF THE SIMULATION MODEL
• All parameters can be modified through a simple Excel file
• A control dashboard has been developed that displays the main indicators easily during the simulation
• The simulation model is easily usable even by personnel not experienced with Arena
24. VALIDATION
The most significant indicators were used to verify the correct functioning of the model:
• Number of weekly accesses
• Waiting time of patients, both overall and for each service
25. VALIDATION

KPI | Historical data | Model | Difference
Weekly accesses | 3048 | 3048 | 0%
Waiting time for delivery of medical reports | 7'41" | 7'48" | 1.5%
Waiting time for delivery of diagnostic reports | 7'18" | 7'01" | 4%
Waiting time for delivery of laboratory exams | 7'18" | 7'00" | 4.2%
Waiting time for request of medical reports | 8'00" | 8'18" | 3.7%
Waiting time for urgencies | 1'11" | 35" | 50.6%
Waiting time for bookings | 7'32" | 7'42" | 2.3%
Total waiting time | 7'20" | 7'09" | 2.6%
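The "Difference" column is a relative deviation between model and historical waiting times. Assuming it is computed against the historical value, it can be reproduced as follows (small rounding differences aside):

```python
import re

def to_seconds(t: str) -> int:
    """Parse a duration written like 7'41" or 35" into seconds."""
    m = re.fullmatch(r"(?:(\d+)')?(\d+)\"", t)
    minutes, seconds = m.groups()
    return int(minutes or 0) * 60 + int(seconds)

def pct_diff(historical: str, model: str) -> float:
    """Relative deviation of the model from the historical value, in %."""
    h, m = to_seconds(historical), to_seconds(model)
    return abs(m - h) / h * 100
```

For example, `pct_diff('7\'41"', '7\'48"')` gives about 1.5%, matching the first waiting-time row of the table.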
26. VALIDATION
• The comparison shows that in almost all cases the difference between model predictions and real data is less than 5%
• The only significant difference is for urgencies, which however are numerically irrelevant
• Based on this evidence, the model was considered to function properly and was used for process optimization
27. SIMULATION OF SCENARIOS
• To optimize the process it is necessary to find the right balance between resource use and waiting time
• Each scenario is evaluated on these two indicators
• The final choice was made in agreement with the management, in order to reduce waiting times while using a number of front office operators that allows back office operations to be maintained
28. SIMULATION OF SCENARIOS
• After interviews with the operators and the management, different scenarios were hypothesized to find the best solution
• The best of the tested scenarios gives an average patient wait of 4 minutes 13 seconds
• The actual data were later compared with the forecasts to test the correct functioning of the model
30. SIMULATION OF SCENARIOS
• The scenario dedicates part of the resources to a path reserved for diabetology
• These operators serve other users only if no patients are awaiting diabetology acceptance
• The other operators handle the services that were provided before the diabetology transfer
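The dispatching rule of this scenario can be written down as a small function. This is a hedged sketch of the policy described above, not code from the actual model:

```python
def pick_queue(operator_is_dedicated, diabetology_waiting, general_waiting):
    """Decide which queue a freed operator should serve next.

    Dedicated operators serve diabetology first and help the general
    queue only when no diabetology patient is waiting; the remaining
    operators only handle the pre-transfer services.
    """
    if operator_is_dedicated:
        if diabetology_waiting:
            return "diabetology"
        if general_waiting:
            return "general"
        return None
    return "general" if general_waiting else None
```

So a dedicated operator with both queues non-empty picks diabetology, while a non-dedicated operator always picks the general queue.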
31. IMPLEMENTATION
• The trial took place from 5 March to 12 May 2018
• To detect implementation problems, direct monitoring of the front office activity was carried out throughout the period
32. IMPLEMENTATION

KPI | October-December 2017 | March 5th-May 12th 2018 | Difference
Total waiting time | 10:37 | 6:09 | -42.1%
Waiting time of patients without priority | 11:26 | 6:29 | -43.3%
Waiting time of diabetology | 3:54 | 2:59 | -23.7%
33. IMPLEMENTATION
• A new scenario was implemented from May 14th to July 7th 2018
• 1 operator fewer between 9:30 and 12:30 for users without priority
• Direct monitoring shows that this program is respected more precisely than the previous one
34. FINAL RESULTS

KPI | October-December 2017 | May 14th-July 7th 2018
Diabetology patients who wait more than 20 minutes | 1.8% | 0.2%
Patients without priority who wait more than 20 minutes | 19.2% | 2.9%
Diabetology patients who wait more than 10 minutes | 10.8% | 4.1%
Patients without priority who wait more than 10 minutes | 47.4% | 22.6%
Diabetology patients who wait less than 5 minutes | 75% | 81.5%
Patients without priority who wait less than 5 minutes | 33% | 54.5%
Maximum waiting time for diabetology | 52:00 | 28:00
Maximum waiting time for patients without priority | 49:00 | 42:00
35. FINAL RESULTS

KPI | Forecast of model | May 14th-July 7th 2018 | Difference
Waiting time of patients without priority | 6:55 | 6:16 | -9.6%
Waiting time of diabetology | 4:10 | 2:58 | -28.9%
36. FINAL RESULTS: EXPLANATION OF THE DIFFERENCES
• Historical data on service and payment times are not available in a database
• They were estimated through direct monitoring of a sample, so discrepancies with respect to the real values are possible
• The service time for diabetology acceptance has dropped significantly over time thanks to the increased experience of the operators
• Direct monitoring shows that, despite the improvement, the program is not always respected
37. FINAL RESULTS
• The application of the new organization has led to a significant improvement in all performance indicators of the process
• The model forecasts are close to the real data
• The differences can be explained objectively
38. CONCLUSIONS
• All the goals that were set at the beginning of
the project were achieved
• Reduction of patients' waiting time
• Demonstration of correct functioning of the
model
• After these important results, the use of
simulation models has been extended to other
processes
39. FUTURE DEVELOPMENTS
• Use of simulation models for process analysis and optimization
• Reorganize and improve communication with patients
• Reduce waiting time at the call center
• Improve the management of the URP (public relations office) email service
• Extend the use of the tool to
healthcare processes
40. FUTURE DEVELOPMENTS
• A simulation model has already been
developed to handle calls to the call
center
• The validation gave satisfactory results
• The tool is expected to be used as part of
a broad operational and technological
reorganization of the service
Editor's Notes
Good morning, my name is Duccio Cocchi and I work as a research fellow of the Università di Firenze at the Azienda Ospedaliero-Universitaria Careggi. Before presenting our project and the results obtained, I would like to thank Dr. Carpini, who believed in this project and who, over the last few years, has made it possible to develop the use of Arena as simulation software within the hospital, improving services for patients and extending our research into ever new areas.
Why did we choose to use simulation in a hospital environment? First, it has been widely demonstrated to be a very reliable technique in every field where it has been applied so far, starting of course with industry and then extending to the service sector. It also saves time and resources for the organization that uses it, which is particularly important in this setting. Being able to evaluate an organizational change that could bring improvements, testing it without risk and at low cost, is very attractive.
Although fewer in number than industrial applications, there are many examples in the literature of simulation software used in healthcare. The problems addressed most frequently concern strictly clinical processes, such as the operation of the emergency department or the operating rooms, which are extremely important for the hospital both clinically and financially. However, there are logistic and administrative processes that have a major impact on the quality of service to patients. A clear example is the process studied here, namely the service center of the Careggi hospital. Our work focused on optimizing the front office activity, described in detail below, also drawing on studies in the literature that addressed the same problem in different settings, taking their common features into account.
We focused on the service center because of organizational problems that arose from an increased workload and the addition of a new activity, which led to longer user waiting times and a severe reduction in service quality; management decided to remedy this through the development and use of a simulation model. The final goal of the project is therefore twofold: to build a working simulation model, demonstrating the validity of the tool, and to improve the service for users.
As a first step in understanding the characteristics of the process, the available historical data were analyzed to define the main parameters needed for the model to work. The data extracted from the database cover about 10 months, with 126,850 front office accesses in total and an average of about 600 per day across the 6 services available in that period.
As this chart shows, the number and daily pattern of accesses vary for each service, which from the beginning of the analysis pushed us, when developing the simulation model, to use hourly arrival schedules computed separately for each type of user request. The chart also shows that the central hours of the morning are the most critical, because of the high workload.
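Hourly arrival schedules of this kind can be sketched as a piecewise-constant Poisson process: each service gets its own table of expected arrivals per hour, and gaps between arrivals within an hour are drawn from an exponential distribution. The rates below are illustrative assumptions, not the actual Careggi access data:

```python
import random

def sample_arrivals(hourly_rates, seed=42):
    """Generate arrival times (in hours since opening) from a
    piecewise-constant Poisson process; hourly_rates[i] is the
    expected number of arrivals during the i-th opening hour."""
    rng = random.Random(seed)
    arrivals = []
    for i, rate in enumerate(hourly_rates):
        if rate <= 0:
            continue
        t = float(i)            # start of this hourly slot
        while True:
            t += rng.expovariate(rate)  # mean gap = 1/rate hours
            if t >= i + 1.0:            # past the end of the slot
                break
            arrivals.append(t)
    return arrivals

# Illustrative schedule for one service, opening 7:00-14:00,
# peaking mid-morning as observed in the access data.
lab_per_hour = [20, 35, 60, 70, 65, 40, 15]
arrivals = sample_arrivals(lab_per_hour)
print(len(arrivals))  # close to sum(lab_per_hour) = 305 on average
```

In the real model each of the 6 services would get its own rate table, so that the hourly mix of requests matches the historical database.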
The decision to add the check-in service for the diabetology outpatient clinics, whose patients have priority over the others, increased the workload beyond what could be managed adequately with the previous organization of human resources, leading to longer user waiting times.
Continuing the data analysis, we studied the patient flow, paying particular attention to the number of accesses by time slot and day of the week, since the workload varies significantly by hour and by day. Other fundamental parameters are the number of resources assigned to the front office and the service time by type of request. For these last two parameters, unfortunately, historical data were missing: in one case because there was no predefined schedule of shifts at the counters, and in the other because of problems with the queue management software.
Problems with workload management emerged clearly at once. As the chart shows, there is a very strong correlation between the number of accesses and the waiting time. The explanation was found by directly observing the process and through interviews with the staff. Unfortunately, because the importance of flow analysis had been underestimated, there was no precise shift schedule: counters were opened and closed based on on-the-spot assessments of rising or falling waiting times, which obviously caused queues to build up rapidly during the peak hours, i.e. from 9:30 to 12:30.
This chart compares the resources used with the waiting time, and it is clear that resource management follows the waiting time rather than trying to control it. For completeness, however, it should be noted that these figures are only indicative, since they are not entirely reliable as to the actual use of the counters, which was verified to be lower than the numbers reported here.
Continuing the analysis of the data after the transfer of the new diabetology check-in service, compared with the 2016 records extracted from the database, a significant increase of about 30% in the number of accesses can be observed. This increase is not due only to diabetology, which still contributes about 500 accesses per week, but also to a general growth in the number of users that was initially underestimated. This situation made it even more urgent to optimize the use of front office resources, especially given the substantial increase in waiting time, and thus in user inconvenience.
As mentioned, the workload is not constant: accesses tend to decrease over the week, so the first days, and Monday in particular, are the most demanding. However, to avoid problems in implementing the shifts, we decided to test with the model a schedule that is the same for every day of the week, except Saturday, when the service center is open only in the morning. This approach was necessary to minimize adaptation problems for the operators, which nonetheless still arose during implementation.
The next step in developing the simulation model is to analyze the characteristics and peculiarities of the process beyond the historical data, to understand how the various requests are handled, with particular attention to all the required activities, their times, and the possible priority of some patients over others.
First, the services provided at the front office were identified: delivery of laboratory results, delivery of diagnostic imaging results, exam bookings, request and delivery of medical records, and urgent cases. As seen above, access data are available in the database for each activity; service time, on the other hand, is not recorded by the queue management software, so it had to be estimated through direct monitoring of a sample of users. This estimate obviously has reliability issues, but it was the only viable way to proceed with validating the model and then using it.
Another particularly difficult parameter to estimate is the payment time. Some of the services provided at the counters require paying a ticket, which often happens while the user is at the counter. In that case the patient must go in person to the payment kiosk, keeping the counter occupied in the meantime. Here too, both the number of payments and the payment times were estimated through direct observation, with the consequent reliability issues, because no historical data were available.
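Since payment and service times had to be estimated from a small monitored sample, the uncertainty of those estimates can be quantified, for instance, with a percentile bootstrap confidence interval for the mean. The payment times below are hypothetical illustrative data, not the actual measurements:

```python
import random

def bootstrap_mean_ci(sample, n_boot=5000, alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for the mean:
    resample with replacement, collect the resampled means, and
    take the alpha/2 and 1-alpha/2 quantiles."""
    rng = random.Random(seed)
    n = len(sample)
    means = sorted(
        sum(rng.choice(sample) for _ in range(n)) / n
        for _ in range(n_boot)
    )
    lo = means[int(n_boot * alpha / 2)]
    hi = means[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Hypothetical kiosk payment times (seconds) from direct monitoring
payment_times = [95, 120, 80, 150, 110, 130, 70, 105, 140, 90]
lo, hi = bootstrap_mean_ci(payment_times)
mean = sum(payment_times) / len(payment_times)
print(f"mean {mean:.0f}s, 95% CI roughly ({lo:.0f}s, {hi:.0f}s)")
```

A wide interval like this makes explicit why the model's forecasts based on sampled times can drift from the observed results.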
Regarding service precedence, it emerged that all services have the same priority, except for urgent cases, which take precedence over everything, and diabetology clinic check-in, which has priority over everything except urgent cases.
Once the data and process analysis was complete, the delicate phase of model development began, followed by validation of the tool, with the final objective of using it in practice to optimize the process and improve the service for users.
The rather complex nature of some procedural steps caused problems during development. In particular, the model had to be modified along the way, as new details emerged about how some phases of the process were handled that had not been adequately considered initially, or had not been reported by the operators.
Let us now look in detail at how the various steps are handled in the simulation model. Patient arrivals are set with an hourly schedule, different for each service and based on the historical data in the database, which are the most reliable. The path the entities follow, which simulates the characteristics of the real process, is determined by attributes assigned to each entity based on the process analysis and the historical data. These attributes, which can be modified by the user, guide the entity as it passes through the decision modules.
Users can follow three paths: front office, payment, and information office. The last path was not activated, for lack of data and because its analysis is not currently of interest to management; however, it was included in view of a future analysis of this service, planned for the near future. At the moment only the front office and payment paths have been tested and used. The service time at the counter, the payment time, and the number of users who use the kiosks to pay the ticket are based on direct observations, and are therefore less reliable than the access data extracted from the database.
To simulate the activity of the front office counters, resource sets and shared resources were used. The counters are divided into two groups: one reserved for diabetology check-in, which serves other users only if no one is waiting, while the remaining counters handle all the other services. To simulate such a process, the diabetology resources were reserved for diabetology patients with maximum priority, while the other users share, with low priority, a resource set formed by the counters reserved for diabetology plus all the others. In this way diabetology check-in is performed only at the reserved counters, which can however also be used for other activities when momentarily empty. If a user has to go to a kiosk to pay a ticket during the procedure at the counter, the model creates a duplicate entity that occupies the payment resource for the required time, while the original entity keeps the counter occupied until the payment is complete, exactly reproducing the real procedure. The model records the number of people in the queue and the waiting time, so that it can be used to evaluate the performance of the process as the parameters vary.
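The Arena resource sets themselves cannot be reproduced here, but the priority logic described above, reserved diabetology counters that fall back to the general queue only when no diabetology patient is waiting, can be sketched with a minimal event-driven simulation in pure Python. All numbers (arrival rate, diabetology share, service time, counter counts) are illustrative assumptions, not the Careggi parameters:

```python
import heapq
import random

def simulate(n_diab_counters=2, n_general_counters=4,
             n_patients=300, seed=7):
    """Minimal sketch of the priority scheme: diabetology counters
    serve diabetology patients first and take general patients only
    when no diabetology patient is waiting; the remaining counters
    serve only the general queue."""
    rng = random.Random(seed)
    # Arrivals: exponential gaps, ~20% diabetology (assumptions).
    t, events = 0.0, []
    for _ in range(n_patients):
        t += rng.expovariate(1 / 0.8)           # mean gap 0.8 min
        events.append((t, "arrival", rng.random() < 0.2))
    heapq.heapify(events)

    diab_q, gen_q = [], []                      # FIFO queues
    free_diab, free_gen = n_diab_counters, n_general_counters
    waits = {True: [], False: []}               # keyed by is_diab

    def start_service(now, queue, on_diab_counter):
        arrived_at, is_diab = queue.pop(0)
        waits[is_diab].append(now - arrived_at)
        dur = rng.expovariate(1 / 4.0)          # mean service 4 min
        heapq.heappush(events, (now + dur, "done", on_diab_counter))

    while events:
        now, kind, flag = heapq.heappop(events)
        if kind == "arrival":
            (diab_q if flag else gen_q).append((now, flag))
        elif flag:                              # diabetology counter freed
            free_diab += 1
        else:                                   # general counter freed
            free_gen += 1
        # Diabetology counters pick diab_q first, then fall back to gen_q.
        while free_diab and (diab_q or gen_q):
            free_diab -= 1
            start_service(now, diab_q if diab_q else gen_q, True)
        # General counters serve only the general queue.
        while free_gen and gen_q:
            free_gen -= 1
            start_service(now, gen_q, False)
    return waits

waits = simulate()
for is_diab, label in [(True, "diabetology"), (False, "no priority")]:
    w = waits[is_diab]
    print(f"{label}: mean wait {sum(w)/len(w):.1f} min, {len(w)} patients")
```

Running scenarios then amounts to varying the counter counts per time slot and comparing the recorded waits, which mirrors how the Arena model was used to pick the best schedule.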
Precisely to this end, one of the interesting features of the model that was developed is its ease of use. All the parameters, including the accesses for each time slot, the resources, the service times, and so on, can be modified through an Excel file, without acting directly inside the model, which can be complicated for anyone who is not expert with the tool.
A dashboard was also developed to display the main indicators during the simulation. This gives an idea of how the queue and the waiting time evolve over the days of the week as the simulation runs; however, the final report is certainly the best tool for obtaining reliable values based on a large number of replications.
Of course, before using the model to reorganize the process, it must be validated. To do this, the most significant indicators were chosen: the number of weekly accesses, to make sure the arrival schedules are correct, and the user waiting time, both per service and overall, to verify the correctness of the estimates.
As the table shows, the results are very positive, with small differences for almost all services. Particularly interesting are the number of weekly accesses, which demonstrates the correctness of the schedules, and the total waiting time, with a minimal difference between the model's estimates and the historical data.
The only significant difference concerns the urgent cases, which are however numerically irrelevant compared to the total and have very peculiar characteristics that make them difficult to reproduce exactly. Apart from this, the model's estimates are very close to reality, with differences below 5%, so the model can be considered to work correctly and can be used.
With the development phase complete, and the correct functioning of the model demonstrated, it is time to optimize the process. The final objective is to find the right balance between the human resources employed for the front office and those needed for the back office, while minimizing user waiting time. To this end, several scenarios were tested, with a different number of counters in the various time slots, to find the best solution.
The best-performing scenario is shown in the next slide and predicts a waiting time, for users both with and without priority, of 4 minutes and 13 seconds. However, even though validation gave very positive results, only a live trial can definitively confirm that the model works correctly.
The test was carried out according to the specifications in the table, which shows the split between counters dedicated to diabetology and resources without priority. The largest number of resources is concentrated around the access peak, so as to handle the situation with as little inconvenience as possible.
The management of the two paths, with and without priority, follows the characteristics described above, with the diabetology counters serving other users only when no one is waiting. This management of the counters differs from the past, when there were no preferential paths, but it was identified as the best solution to guarantee the best possible service to all users while safeguarding the need to handle diabetology check-in quickly, thus avoiding problems with appointments.
The implementation test was carried out between 5 March and 12 May 2018, accompanied by direct monitoring of the situation to detect any problems in the front office activity.
As can be seen, the results show a clear improvement over the last quarter of 2017. However, they are clearly far from the model's estimate, which predicted a waiting time of just over 4 minutes. This unsatisfactory result is due to the need to use more resources than planned to handle the back office activity, which reduced the resources at the counters and thus increased waiting times. This problem was verified directly during the monitoring, especially at times of peak inflow.
Acknowledging this situation, and in agreement with management, it was decided to test a new scenario, with one operator fewer in the peak time slot, to meet the operators' needs. The trial took place from 14 May to 7 July 2018, and direct monitoring confirmed that the schedule was followed much more closely in this case.
Reducing the number of resources obviously scaled down the benefits of the reorganization, but the results are still very positive, with a clear improvement in all performance indicators for all users, as the table shows. In particular, the sharp drop in the number of users waiting more than 10 and more than 20 minutes is the most significant result, because it greatly reduces patient inconvenience and complaints.
However, the most important results, especially for confirming the correct functioning of the model, concern patient waiting times. As you can see, there are some differences with respect to the model's forecasts, especially for diabetology, which can however be explained satisfactorily.
First, it must be borne in mind that the times needed to pay tickets and to complete the procedures at the counter are not available from a database; they were estimated from monitoring a small sample and are thus subject to error. In addition, diabetology check-in uses software that the operators did not know at first, so as they gained experience the service time decreased considerably. Finally, the operators did not always follow the schedule precisely, because of their inclination to manage the work independently. All these factors explain the difference between real data and forecasts, especially for diabetology. The results are nonetheless satisfactory, especially for patients without priority, who are the most numerous, and for whom the difference between the model's forecast and reality is below 10%.
In conclusion, we can state that the use of the simulation model led to a clear improvement in all process indicators, and thus to a marked improvement in the service to users. The model's forecasts proved very close to the real results, considering all the uncertainty factors listed above, so the correct functioning of the tool has been demonstrated.
All the initial goals of the project were achieved. The most important was the reduction of user waiting times, despite the considerable increase in workload. The correctness of the model's estimates, and thus its functioning, was also demonstrated, opening the way to using this software for other processes within the hospital.
On this subject, among the many areas we are currently focusing on, in both healthcare and non-healthcare processes, the reorganization and optimization of communications with users is the most important, especially the reduction of call center waiting times, which currently suffer from performance problems that cause serious inconvenience to users.
A simulation model has already been developed and tested on the available historical data, with excellent results. We are now starting to use the tool to evaluate a set of scenarios in order to define a technological and operational reorganization program that will improve the service, as was already done for the service center.