This study aimed to develop a methodology to accurately predict the time needed to complete a series of surgical cases in order to avoid overutilization of operating room time. The researchers analyzed 6,090 cardiac surgeries performed between 2004 and 2009, fitted lognormal distributions to surgical and turnover times, and used the Fenton-Wilkinson approximation to estimate the total duration of a scheduled series of cases. When tested on 95 actual schedules over three months in 2009, neither the estimated average duration nor the second tertile cut-off point alone predicted overtime without considerable false positives, but their combined use may help limit overutilization of operating room time.
Prediction of the time to complete a series of surgical cases to avoid OR overutilization
1. Prediction of the Time to Complete a Series of Surgical
Cases to Avoid Cardiac Operating Room Overutilization*
Rene Alvarez, MEng
Centre for Research in Healthcare Engineering, Department of
Mechanical and Industrial Engineering at the University of Toronto
St. Michael’s Hospital, Toronto, ON, Canada
Richard Bowry, MB BS FRCA
St. Michael’s Hospital, Toronto, ON, Canada
Faculty of Medicine, University of Toronto
Michael Carter, PhD
Centre for Research in Healthcare Engineering, Department of
Mechanical and Industrial Engineering at the University of Toronto
* Accepted for publication in the Canadian Journal of Anesthesia
Editor: Donald R. Miller, M.D.
Reviewer: Franklin Dexter, M.D., PhD
ORAHS 2010
4. Objective
We present a methodology to accurately estimate the time to complete a series of surgical cases in a single cardiac OR to avoid overutilization when:
the first case starts on time
there are no add-on cases
block time was calculated to match the typical OR workload
6. OR Efficiency
Efficient OR utilization must account for the cost of both underutilized and overutilized OR hours
From the accounting perspective, the staffing expense during scheduled hours is a sunk cost, so the savings from finishing cases early are effectively zero
A “zero tolerance for overtime” policy may be too rigid
Therefore, OR efficiency has two competing priorities:
using all available time to perform cases
controlling overutilization
7. Overutilization
If for a single OR we assume that:
1. the first case starts on time
2. there are no add-on cases
3. block time was calculated to match the typical OR workload
Then, overutilization in that OR can be minimized by accurately estimating the time required to complete each case
8. How to estimate surgery duration?
a. Surgeons’ estimation
b. Average time using historical data
c. Historical data combined with the surgeon’s own estimate
d. A linear prediction model that combined objective factors with the surgeons’ estimate of operative time
9. Lognormal approximation
Most authors agree that the lognormal distribution is adequate to represent surgical times
10. How to sum these random times?
Alvarez et al. (ORAHS 2008) suggested a methodology based on the Fenton-Wilkinson approximation
The Fenton-Wilkinson approach:
gives an accurate estimate, particularly in the tail of the cumulative distribution function
offers a closed-form solution for the parameters of the approximating lognormal distribution
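The moment matching behind the Fenton-Wilkinson approximation is compact enough to sketch. The Python snippet below is not the authors' code: the function and parameter names are illustrative, the two-parameter (unshifted) lognormal form is assumed for simplicity, and the example log-hour parameters are hypothetical. It approximates the sum of independent lognormal case and turnover times by a single lognormal whose mean and variance match those of the sum, which is the closed-form step referred to above.

```python
import numpy as np

def fenton_wilkinson(mus, sigmas):
    """Lognormal parameters (mu_z, sigma_z) of the Fenton-Wilkinson approximation
    to a sum of independent lognormal variables with log-domain parameters
    mus[i], sigmas[i]."""
    mus, sigmas = np.asarray(mus, float), np.asarray(sigmas, float)
    means = np.exp(mus + sigmas ** 2 / 2)                                   # E[X_i]
    variances = (np.exp(sigmas ** 2) - 1) * np.exp(2 * mus + sigmas ** 2)   # Var[X_i]
    m, v = means.sum(), variances.sum()                                     # moments of the sum
    sigma_z2 = np.log(1 + v / m ** 2)                                       # matched log-variance
    mu_z = np.log(m) - sigma_z2 / 2                                         # matched log-mean
    return mu_z, np.sqrt(sigma_z2)

# Hypothetical schedule: two CABG cases and one turnover (log-hour parameters)
mu_s, sigma_s = fenton_wilkinson([1.35, -0.75, 1.35], [0.25, 0.40, 0.25])
```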
12. Data: surgeries
We studied 6,090 cases performed by 9 different cardiovascular surgeons between January 1st, 2004 and January 30th, 2009 at St. Michael’s Hospital, located in Toronto, Ontario, Canada
Cases were grouped clinically into 13 different categories:
1. Aortic plus mitral valve replacement/repair
2. Aortic valve replacement/repair
3. Aortic valve replacement/repair plus CABG
4. Ascending aorta plus aortic valve replacement/repair or CABG
5. Ascending aorta repair
6. CABG x1 x2 x3
7. CABG x4 x5 x6
8. Chest re-opening/closure
9. Complex
10. Major procedure (with cardiopulmonary bypass)
11. Minor procedure
12. Mitral valve replacement/repair
13. Mitral valve replacement/repair plus CABG
Coronary artery bypass graft surgery (CABG) accounted for 63.33% of the cases
13. Data: turnover times
We collected data during a five-month period (January 2009 - May 2009)
The average turnover time was 0.50 hours, with a standard deviation of 0.23 hours
14. Lognormal distribution fit
We fitted three-parameter lognormal distributions to surgical times and turnover times
To study the lognormal goodness of fit we:
conducted Kolmogorov-Smirnov tests, and
performed two graphical analyses:
1. comparison of time histograms against the fitted lognormal distributions
2. probability plots to compare the real data quantiles against the lognormal ones
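As a rough illustration of this fitting step (hypothetical durations; scipy assumed, with scipy's loc parameter standing in for the three-parameter shift), the fit and the Kolmogorov-Smirnov check could look like the sketch below; the two graphical checks mentioned above would correspond to a histogram overlay and a probability plot of the fitted distribution.

```python
import numpy as np
from scipy import stats

# Hypothetical surgical durations in hours
durations = np.array([3.9, 4.2, 4.6, 5.1, 5.4, 6.0, 6.3, 7.1, 7.8, 9.2])

# Three-parameter (shifted) lognormal: shape s, shift loc, scale exp(mu)
s, loc, scale = stats.lognorm.fit(durations)

# Kolmogorov-Smirnov test of the data against the fitted distribution
ks_stat, p_value = stats.kstest(durations, 'lognorm', args=(s, loc, scale))
print(f"shape={s:.3f}, shift={loc:.3f}, scale={scale:.3f}, KS p={p_value:.3f}")
```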
15. Validation
We selected a schedule composed of 2 CABGs performed by the same surgeon (256 historical records)
We obtained the probability distribution of this schedule using our methodology
We then simulated 1 million schedule durations
Finally we compared the simulated ones with the real durations

Percentile   FW (h)   Real (h)   Diff (min)   Diff (%)
PC_5          6.46     6.75      -17.47       -4.31
PC_10         6.79     7.00      -12.45       -2.96
PC_15         7.02     7.17       -8.55       -1.99
PC_20         7.21     7.33       -7.33       -1.67
PC_25         7.38     7.42       -2.47       -0.55
PC_30         7.53     7.50        1.56        0.35
PC_35         7.67     7.75       -5.03       -1.08
PC_40         7.80     7.83       -1.91       -0.41
PC_45         7.93     8.00       -3.91       -0.81
PC_50         8.07     8.08       -0.97       -0.20
PC_55         8.20     8.25       -2.93       -0.59
PC_60         8.34     8.42       -4.70       -0.93
PC_65         8.48     8.50       -1.04       -0.20
PC_66_66      8.53     8.50        1.96        0.39
PC_70         8.64     8.67       -1.78       -0.34
PC_75         8.81     8.75        3.36        0.64
PC_80         9.00     9.08       -5.17       -0.95
PC_85         9.23     9.50      -16.46       -2.89
PC_90         9.52     9.83      -18.96       -3.21
PC_95         9.96    10.25      -17.15       -2.79
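A sketch of this kind of simulation check (hypothetical log-hour parameters; numpy and scipy assumed; fenton_wilkinson refers to the earlier sketch): draw a large number of schedule durations as sums of independent lognormal components and compare empirical percentiles with the Fenton-Wilkinson approximation.

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(0)
mus, sigmas = [1.35, -0.75, 1.35], [0.25, 0.40, 0.25]   # case, turnover, case (hypothetical)

# 1,000,000 simulated schedule durations: sum of independent lognormal draws
sims = sum(rng.lognormal(mu, sg, size=1_000_000) for mu, sg in zip(mus, sigmas))

mu_s, sigma_s = fenton_wilkinson(mus, sigmas)   # from the earlier sketch
for p in (5, 50, 66.66, 95):
    fw = lognorm.ppf(p / 100, s=sigma_s, scale=np.exp(mu_s))
    print(f"P{p}: simulated {np.percentile(sims, p):.2f} h, FW {fw:.2f} h")
```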
16. Estimators for schedule duration
1. The “estimated average duration of the schedule”, calculated as the sum of the average surgical times and turnover times in the schedule
   this “empirical average” is equivalent to the mean value of the lognormal probability distribution of the schedule duration
2. The second tertile cut-off point of the lognormal distribution, obtained using the Alvarez et al. (ORAHS 2008) methodology
   the time taken for 2/3 of cases to be completed
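Both estimators fall directly out of the approximating lognormal. A minimal sketch (the log-hour parameters mu_s and sigma_s are hypothetical stand-ins for the Fenton-Wilkinson output from the earlier sketch; scipy assumed):

```python
import numpy as np
from scipy.stats import lognorm

mu_s, sigma_s = 2.06, 0.18   # hypothetical log-hour parameters of the schedule duration

# Estimator 1: the "estimated average duration of the schedule" (mean of the lognormal)
estimated_average = np.exp(mu_s + sigma_s ** 2 / 2)

# Estimator 2: the second tertile cut-off point (2/3 quantile of the lognormal)
second_tertile = lognorm.ppf(2 / 3, s=sigma_s, scale=np.exp(mu_s))
```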
19. Simultaneous schedule tracking
June-August 2009
138 scheduled blocks
43 blocks were excluded due to last minute changes to the schedule resulting in unpredicted delays or case cancellations
95 schedules were analyzed
42 (44.2%) comprised two sequential coronary artery bypass graft (1 to 3 bypasses) surgeries
20. Prediction of the total duration
Average: 0.19 hrs
2nd tertile cut-off point: 0.59 hrs
21. Overtime
37 (39.95%) schedules with overtime
average overtime was 65.81 minutes
standard deviation 50.33 minutes
range from 5 minutes to 170 minutes
The estimated average predicted 44 overrun schedules; 32 overran
The second tertile cut-off point predicted 61 overrun schedules; 35 overran
26 false predictions in total: the real duration of the schedule was on average located at the 26.67% percentile point (standard deviation 17.53%)
23. Lognormal fit
Graphical analyses and computer simulation validate the lognormal distribution even in those cases where the p-value of the traditional goodness of fit test rejects it
24. Average
We validated the use of the average duration of a series of surgical cases and turnover times to estimate total schedule duration
We found the average value to be located between the 51% and 53% percentile points for 74.74% of the cases
25. Overtime prediction capacity
Our results suggest that neither the estimated average nor the second tertile cut-off point alone is able to predict the need for overtime without considerable false positive results
As suggested by Alvarez et al. (ORAHS 2008), the combined use of the estimated average schedule duration and the second tertile cut-off point may help limit overtime expense
26. Cancellations
An analysis of the cancellation data, where the second case was cancelled due to insufficient time, showed that most of the first cases exceeded the second tertile cut-off point
This is an expected effect of lognormally distributed operating times and cannot be prevented using this methodology
28. Decision rule
Approve a schedule only when the second tertile cut-off point is less than or equal to the block time length plus an “acceptable overtime” (e.g. 30 minutes for an 8-hour block)
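The rule above reduces to a one-line check. A minimal sketch (names and default values are illustrative, not from the presentation):

```python
def approve_schedule(second_tertile_hours, block_hours=8.0, acceptable_overtime_hours=0.5):
    """Approve only if the 2/3 quantile of the predicted schedule duration
    fits within the block time plus the tolerated overtime."""
    return second_tertile_hours <= block_hours + acceptable_overtime_hours

approve_schedule(8.4)   # True: fits within 8 h plus 30 min of acceptable overtime
approve_schedule(8.9)   # False: would likely exceed the acceptable overtime
```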
29. Prediction of the Time to Complete a Series of Surgical Cases to Avoid Cardiac Operating Room Overutilization*