Artificial Intelligence for classifying Sleep Apnea patients
Saidah Naqiyah Bte Suleiman, Dr Martin Buist
Department of Biomedical Engineering
AY 2018-2019

Hyperparameters explored

Parameter                 Values
Learning rate             0.1 (default); 0.0 to 1.0 (range tested); 0.01 (final)
Batch size                32 (default); 16 to 256 in powers of 2
Optimizer                 Mini-Batch GD (default); Adam
Activation function       Leaky ReLU; Tanh; Sigmoid
Number of hidden layers   1 hidden layer; 2 hidden layers
Number of hidden nodes    2/3 (70% or 90%) of input layer; less than twice the input layer; between input and output layer sizes; 10 to 18 → 14 (final)
Introduction
With the rising prevalence of sleep disorders such as sleep apnea in developing nations, including Singapore, hospitals are expected to handle an increasing number of patients and, inevitably, of patient data. This project therefore aims to determine whether AI, such as neural networks, has the potential to classify patients according to their severity levels and ultimately replace the manual labour of sieving through information in hospitals.
Methodology
Artificial Neural Network
Input layer (20 nodes):
• 3,281 patient records from the Sleep Heart Health Study
• Data split into training and test sets in an 80% to 20% ratio
• Data normalized to lie between 0 and 1
• 20 input features (listed below)
Hidden layer(s):
• Number of hidden layers and nodes were varied
Output layer:
• 2 classes → AHI >10 and <10
• 4 classes → normal (AHI <5), mild (5-15), moderate (15-30) and severe (>30)
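The preprocessing and labelling steps above (min-max normalization to [0, 1], an 80/20 train/test split, and AHI-based severity classes) can be sketched as follows. This is a minimal NumPy illustration; the function names and the random shuffling scheme are assumptions, not taken from the project's code.

```python
import numpy as np

def ahi_to_class(ahi):
    """Map an AHI value to the 4-class severity label used on the poster."""
    if ahi < 5:
        return "normal"
    elif ahi < 15:
        return "mild"
    elif ahi < 30:
        return "moderate"
    return "severe"

def minmax_normalize(X):
    """Scale each feature column to lie in [0, 1]."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)  # guard against constant columns
    return (X - lo) / span

def train_test_split(X, y, train_frac=0.8, seed=0):
    """Shuffle the records and split them into an 80%/20% train/test partition."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    cut = int(train_frac * len(X))
    return X[idx[:cut]], y[idx[:cut]], X[idx[cut:]], y[idx[cut:]]
```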
Training (repeated until the end of each epoch), followed by testing.
The 20 input features:
→ Risk factors:
1. BMI
2. Weight
3. Age
4. Gender
5. Neck circumference
6. Cigarettes per pack per year
7. Alcohol consumption
→ Complications:
8.-9. High blood pressure (systolic and diastolic)
10. Hypertension
11. Diabetes
12. Asthma
13. Frequency of being awakened by heartburn or indigestion
14. Cholesterol levels
15.-18. Number of oxygen desaturation events with at least 2%, 3%, 4% and 5% oxygen desaturation
19.-20. EEG band ratio between the alpha and beta signals (raw and in seconds)
Training flow:
Feed data (input features) → weights & biases arbitrarily initialised → weighted sum → activation function (repeated all the way to the output layer) → compute loss → optimization (learning rate, batch size) → weights and biases updated
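The training flow above can be sketched in NumPy as a one-hidden-layer network trained with mini-batch gradient descent. This is an illustrative sketch only: the weight-initialisation scale, the cross-entropy loss, and the leaky-ReLU negative slope of 0.01 are assumptions, not taken from the project's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_relu(z, alpha=0.01):
    return np.where(z > 0, z, alpha * z)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def train(X, y_onehot, hidden=14, lr=0.01, batch=32, epochs=10):
    """Mini-batch GD on a 1-hidden-layer network with manual backprop."""
    n, d = X.shape
    k = y_onehot.shape[1]
    # weights and biases arbitrarily initialised
    W1 = rng.normal(0, 0.1, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.1, (hidden, k)); b2 = np.zeros(k)
    for _ in range(epochs):
        order = rng.permutation(n)
        for s in range(0, n, batch):
            xb, yb = X[order[s:s+batch]], y_onehot[order[s:s+batch]]
            # forward pass: weighted sum -> activation -> output layer
            z1 = xb @ W1 + b1
            h = leaky_relu(z1)
            p = softmax(h @ W2 + b2)
            # compute loss gradients (mean cross-entropy with one-hot targets)
            dz2 = (p - yb) / len(xb)
            dW2 = h.T @ dz2; db2 = dz2.sum(axis=0)
            dz1 = (dz2 @ W2.T) * np.where(z1 > 0, 1.0, 0.01)
            dW1 = xb.T @ dz1; db1 = dz1.sum(axis=0)
            # optimization step: weights and biases change
            W1 -= lr * dW1; b1 -= lr * db1
            W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2

def predict(X, params):
    W1, b1, W2, b2 = params
    return softmax(leaky_relu(X @ W1 + b1) @ W2 + b2).argmax(axis=1)
```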
(In the hyperparameters table, the first value listed for each parameter is the default and the last is the final chosen value.)
Results & Discussion
[Charts: average accuracy levels (%) vs. each parameter]
→ Learning rates & batch sizes
→ Most influential parameter
Conclusion

Output classification   Final accuracy
2 classes               88.13%
4 classes               61.95%

Sensitivity (%) for the 4-class output:
normal 0.0, mild 85.95, moderate 63.13, severe 68.87

From the results obtained, complete elimination of manual labour may not be feasible using the current model. This could be attributed to the imbalance in the patient data distribution, such as across the different output classification categories and the input features. Hence, with greater clinical data and neural network accuracy, this study has the potential to replace manual labour and even overnight PSG studies.
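Per-class sensitivities like those reported above can be computed from true and predicted labels with a short helper. This is an illustrative sketch, not the project's evaluation code.

```python
import numpy as np

def per_class_sensitivity(y_true, y_pred, n_classes):
    """Sensitivity (recall) per class: TP / (TP + FN)."""
    sens = []
    for c in range(n_classes):
        mask = (y_true == c)
        # fraction of class-c samples correctly predicted as class c
        sens.append(float((y_pred[mask] == c).mean()) if mask.any() else float("nan"))
    return sens
```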
→ Optimizers & activation functions
→ Number of hidden layers and nodes

Average accuracy (%) by optimizer and activation function:

                Batch size 32, learning rate 0.01    Batch size 128, learning rate 0.1
                Leaky ReLU   Tanh    Sigmoid         Leaky ReLU   Tanh    Sigmoid
Mini-Batch GD   87.70        87.73   82.83           88.28        88.28   85.62
Adam            87.43        87.15   -               87.98        88.13   -
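For reference, the three activation functions compared in the table can be written as follows; the leaky-ReLU negative slope of 0.01 is an assumption, as the poster does not state it.

```python
import numpy as np

def leaky_relu(z, alpha=0.01):
    # identity for positive inputs, small linear slope for negative inputs
    return np.where(z > 0, z, alpha * z)

def tanh(z):
    # squashes inputs to (-1, 1)
    return np.tanh(z)

def sigmoid(z):
    # squashes inputs to (0, 1)
    return 1.0 / (1.0 + np.exp(-z))
```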

BME Final Year Project Poster 2019
