Socio-Technical Systems Case Study - a more extensive description
1. Philips HealthSuite
Cognitive Ergonomics
Assignment 2: a more extensive description
Alexandra Maria Bartas - 5165369
Rojin Ghorbani Moghadam - 5088135
Jeroen Lieveloo - 4567382
Parastou Raeis Ghanavati - 5242568
2. Introduction
Philips built its HealthSuite Platform on AWS to enable
virtually unlimited scalability, faster time-to-market, and
simplified privacy and security compliance for innovative
healthcare and life science solutions. The HealthSuite
Platform combines the power, security, and flexibility of
AWS services with Philips’ healthcare expertise to bring
innovation to the industry.
Source:
Healthcare compliant Cloud solutions. (n.d.). Philips. Retrieved September 1,
2021, from https://www.usa.philips.com/healthcare/innovation/about-health-suite
3. Literature (Ale)
Philips HealthSuite digital platform. (2014). Get started with Philips HealthSuite digital platform [Brochure]. https://cf-s3-9b8a2d91-c007-4d01-985a-ef737a876dbd.s3.amazonaws.com/s3fs-public/PDF/hsdp-brochure.pdf
Parasuraman, R., Sheridan, T., & Wickens, C. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, 30(3), 286–297. https://doi.org/10.1109/3468.844354
The first platform within the “health” theme of this course that we chose to study is Philips HealthSuite. It is an open service, meaning its eventual uses are not yet fixed, so we envisioned an idyllic future possibility.
Philips HealthSuite’s promises are Security, Simplicity, Speed, and Savings.
Based on the feedback from the last presentation, we remade this workflow. In this presentation we compare it to our current workflow, seen on the next slide.
The final iteration of the interaction map is more focused on the user. Hence the flow begins with the user entering data from medical devices into the system.
The different apps in the system are distinguished, as well as what the big data consists of and how it contributes to the other businesses.
Based on the feedback from this presentation, we remade Transitional Map 1 and Transitional Map 2 into the Refined Transitional Map seen on slide 8. Initially, in this map, we envisioned that the monitoring of the patient goes on regardless of any changes in the activity categories; therefore a straight, continuous line is drawn across the monitoring activity. At some point in the timeline, however, an abnormality is detected by the AI analysing the user’s data. As a result the patient moves into the diagnosis phase, where the doctor comes in and checks whether the AI’s diagnosis is correct. If it is, the patient moves up into the treatment phase, where both the doctor and the caregiver play a role. Later in the process, when the patient is cured, they move back to the monitoring phase, skipping the diagnosis phase.
Based on the feedback from this presentation, we remade Transitional Map 1 and Transitional Map 2 into the Refined Transitional Map on slide 8.
In this scenario, the doctor decides that the AI’s diagnosis is incorrect; therefore the patient moves back to the monitoring phase.
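The phase transitions described above can be sketched as a small state machine. The phase names and transition rules are our reading of the transitional map; none of this is a Philips API.

```python
# Hypothetical sketch of the care phases in the refined transitional map.
# Monitoring continues throughout; only the active phase changes.

from enum import Enum

class Phase(Enum):
    MONITORING = "monitoring"
    DIAGNOSIS = "diagnosis"
    TREATMENT = "treatment"

def next_phase(current: Phase, *, abnormality_detected: bool = False,
               doctor_confirms_ai: bool = False, cured: bool = False) -> Phase:
    """Advance the patient through the map for one event."""
    if current is Phase.MONITORING and abnormality_detected:
        return Phase.DIAGNOSIS          # AI flags an abnormality in the data
    if current is Phase.DIAGNOSIS:
        # Doctor checks the AI's diagnosis: confirmed -> treatment,
        # rejected -> back to monitoring.
        return Phase.TREATMENT if doctor_confirms_ai else Phase.MONITORING
    if current is Phase.TREATMENT and cured:
        return Phase.MONITORING         # cured: straight back, skipping diagnosis
    return current
```

The "cured" path returns directly to monitoring without passing through diagnosis, matching the refined map.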
In the refined transitional map, we considered the different steps within the 3 main activities. Instead of a long-term timeline, we focused on the sequence of activities happening before and after a disease diagnosis.
Moreover, non-human actors such as the medical devices and the app are included in the process.
The human information processing model is taken from Proctor & Vu (2010).
The AI in this system follows the HIP model just like a human, but with the doctor in the loop (as a human in the loop) to ensure that responsibility is assigned and potential mistakes are filtered out. The prescription (which results in treatment) then has an effect on the user; therefore the user’s input to the smart devices changes over time, completing the loop.
If the doctor disagrees with the AI’s suggested diagnosis, the doctor can access the interpreted data in order to make their own diagnosis.
By interpreted data we mean data with meaning: for example, 36 degrees Celsius (interpreted data) instead of a heat-sensor value of 453 (raw data).
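The raw-versus-interpreted distinction amounts to a small conversion step, as in this sketch; the calibration (453 mapping to 36 °C) is invented purely for illustration.

```python
# Sketch: turning a raw heat-sensor reading into interpreted data.
# The calibration constant is made up so that the example reading 453
# comes out as 36.0 degrees Celsius; it is not a real device spec.

def interpret_temperature(raw_value: int) -> float:
    """Map a raw sensor count to degrees Celsius (hypothetical calibration)."""
    SCALE = 36.0 / 453.0
    return round(raw_value * SCALE, 1)
```

Only the interpreted value is meaningful to the patient and the doctor; the raw count stays inside the device layer.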
Human Information Processing Model with 3 actors:
1. Patient before treatment
The patient starts by using medical devices (Sensory Processing), which help them log their data and visualise it (Perception/Working Memory). This way they can get an overview of it and interpret it (Decision Making). After interpreting it, they can choose whether they want to share it, for example with family members who want to be in the loop about their care (Decision Making and Response Selection).
2. Patient after treatment
The patient would receive a notification that they have received a prescription (Sensory Processing). They would have time to interpret the prescription (Perception/Working Memory), after which they can choose to follow it, or perhaps ask for a second opinion or a change (Decision Making). The action that follows is the Response Selection.
3. Doctor
The doctor would receive the data and the data analysis, e.g. a prioritisation (Sensory Processing), after which they can analyse it (Perception/Working Memory). In the Decision Making stage, based on the previous analysis, they weigh different treatments and prescription adjustments. In the Response Selection stage they make a decision based on the previous stage and share it with the patient (or contact them directly).
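The three actor walkthroughs above all pass through the same four HIP stages, which can be sketched as a generic pipeline. The stage names follow the model as used in these notes; the handler functions are placeholders of our own, not anything from HealthSuite.

```python
# Sketch: the four HIP stages as an ordered pipeline. Each actor (patient
# before/after treatment, doctor) supplies their own handler per stage;
# stages without a handler pass the value through unchanged.

HIP_STAGES = (
    "sensory processing",
    "perception / working memory",
    "decision making",
    "response selection",
)

def run_hip(stimulus, handlers):
    """Pass a stimulus through the four stages in order."""
    value = stimulus
    for stage in HIP_STAGES:
        value = handlers.get(stage, lambda v: v)(value)
    return value

# Illustrative handlers for the doctor's flow described above.
doctor = {
    "sensory processing": lambda d: {"data": d},           # receives data + analysis
    "decision making": lambda d: {**d, "treatment": "A"},  # weighs treatments
    "response selection": lambda d: {**d, "shared": True}, # shares the decision
}
```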
Technical Information Processing for the 2 interfaces:
eCareCompanion - the interface used by the patient
eCareCoordinator - the interface used by the doctor
Both are described further on slides 18 and 19.
We used this source for the next slides to indicate the level of automation in the system.
We tried to envision how automation could work for the two systems (eCareCompanion and eCareCoordinator).
For eCareCompanion, during the Information Acquisition stage the automation is low, since it is up to the user (patient) to use medical devices to log data; however, the service only provides a few options (devices from Philips, such as a scale or a blood pressure monitor).
After that, the Information Analysis runs in the background, which is why the automation is high.
Decision and action selection is where the system could suggest additional data for the user to log, or additional devices they could use. At the beginning the automation would be low, providing only a few suggestions, but in the future the system could determine which data are best to log based on existing data or medical history (to make a more complete diagnosis). However, since this involves sensitive information, the human would need to be asked before the task is performed.
In the Action Implementation stage, the information can be shared with the doctor, caregiver or family (so they can be in the loop on the patient’s care), but only if the patient consents.
Level 3: narrows the selection down to a few
Level 10: decides everything, acts automatically, ignores the human
Level 8: informs the human only if asked
Level 5: executes that suggestion if the human approves
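The stage-by-stage levels above can be written down as a simple lookup over Parasuraman et al.'s four stages. Pairing the listed numbers with the stages in order is our own reading of the slide, not an official assignment.

```python
# Sketch: eCareCompanion automation levels per information-processing stage,
# on the 1-10 levels-of-automation scale. The level numbers are our own
# assignment, read off the list above in stage order.

ECARE_COMPANION_LEVELS = {
    "information acquisition": 3,    # only a few Philips devices offered
    "information analysis": 10,      # runs fully in the background
    "decision and action selection": 8,   # informs the human only if asked
    "action implementation": 5,      # shares data only if the patient approves
}

def automation_level(stage: str) -> int:
    """Look up the assigned level for a stage (case-insensitive)."""
    return ECARE_COMPANION_LEVELS[stage.lower()]
```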
For eCareCoordinator, the Information Acquisition stage is when the data is received from eCareCompanion. The doctor would be notified in this case.
During Information Analysis, the doctor would have access to the analysed data, with the system helping to e.g. prioritise it. In the future, the system would hopefully learn to do this step by itself. (Note after the presentation: due to automation bias, this step would be better checked by the doctor instead of fully automated.)
During Decision and Action selection, the system can give suggested or alternative diagnoses, or adjustments to the treatment; in the future it could narrow the options down to a definitive answer (in an ideal case), where the doctor can choose whether or not to check it.
For the Action Implementation the prescription is shared if the doctor approves.
Level 9: informs the human only if it, the computer, decides to
Level 2: offers a complete set of decision/action alternatives
Level 10: decides everything, acts automatically, ignores the human
Level 3: narrows the selection down to a few
Level 8: informs the human only if asked
Level 5: executes that suggestion if the human approves
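The Action Implementation step for eCareCoordinator sits at level 5 ("executes that suggestion if the human approves"), which can be sketched as a simple approval gate. The function and parameter names are illustrative, not part of the HealthSuite API.

```python
# Sketch of level-5 action implementation for eCareCoordinator:
# the system prepares the prescription, but it is only shared with
# the patient if the doctor explicitly approves.

from typing import Optional

def implement_action(prescription: str, doctor_approves: bool) -> Optional[str]:
    """Return the prescription to send, or None if it is withheld."""
    if doctor_approves:
        return prescription   # forwarded to the patient via eCareCompanion
    return None               # withheld; the doctor keeps control
```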
The UI concept is based on the insights from HIP and TIP. As demonstrated on slide 19, we aim towards higher but trustworthy automation. Therefore, we included the “Training” section, where doctors can review the data analysis and the diagnoses suggested by the AI and make changes if necessary; from that, the AI would get better each time at making decisions for the patients.
Two different versions of the HealthSuite app wireframe are demonstrated on this slide, based on the previous information architecture.