ISU Mobile Services Group
UCSF School of Medicine
emPATH Open Framework
Care Pathway XML Developer's Guide
Larry Suarez
Version 1.5
Table of Contents

The emPATH Framework
General Pathway Structure
    Interventions
    Interactions
    Transitions
The Semantic Web and emPATH
    Resource Identification
    emPATH Data Model
Care Pathway Representation
    Care Pathway
    Interventions
    Interactions
    Responses
    Transitions
    Behaviors
    Example 1: Pain Survey (2 Questions)
    Example 2: Pain Survey with Conditional Transitions (3 Questions)
    Example 3: Pain Survey with Iteration
Revision History

Date      Rev  Author        Result
10/16/11  1.0  Larry Suarez  Initial document.
10/18/11  1.5  Larry Suarez  Numerous edits.
The emPATH Framework
The emPATH Framework provides a platform for researchers and software developers to deliver
mobile-based medical solutions in support of evidence-based medicine (EBM). The primary
means of supporting EBM is through care pathways. Figure 1 shows an example care pathway
that manages a patient wearing a heart rate sensor. The components in the diagram consist of
interventions (blue spheres with a tan outer ring), which represent the pathway itself; interactions
(blue spheres), which represent work to do; and decision points (blue triangles), which represent
points at which a decision must be made about which of multiple paths to take in the pathway.
Arrows between components are known as transitions. Transitions that are drawn into a decision
point are said to "merge" while transitions drawn from a decision point are said to "split".
The sample pathway in Figure 1 shows a number of advanced features within the emPATH
framework:
• Parallel branch execution - execute multiple branches in the pathway at the same time
("Extract Vitals" branch and "Notify Care Provider" branch).
• Autonomous task/step execution - execute a task in the pathway without user
intervention ("Extract Vitals", "Monitor Vitals").
• Multi-system interaction - interact with external systems from within the pathway
("Notify Care Provider", "Update EMR").
• Block invocation - invoke pre-defined blocks of tasks/steps from within the pathway
("Schedule Next Intervention" intervention).
• Complex Decision Support - support for AND, OR, and XOR decision branching.
• External sensor support - access sensor data directly from the pathway ("Extract
Vitals").
Figure 1: Sample Care Pathway
Care pathways also support cross-disciplinary work. Medical researchers and computer
scientists use care pathways to communicate their solutions. Care pathways are based on
technology that is understood by numerous disciplines. There is also a tremendous amount of
research on the manipulation of care pathways to support event-driven solutions. Care pathways
also represent easily identifiable content that can be validated outside the bounds of the emPATH
framework, such as when the FDA requires validation of medical applications. emPATH is
designed to execute multiple pathways in parallel in reaction to events occurring around the
wearer [1]. In effect, any external event could cause the mobile device to react via a care pathway,
as shown in Figure 2.
Figure 2: Event-Driven emPATH Framework
Care pathways are a well-known construct in medical research. Many national pathways are
published by organizations for care provider consumption. What is typically not well known is the
means by which to express pathways. Pathways are published in many forms, including
descriptive text and/or diagrams. Pathways in this guide are expressed in two forms: diagrams
that borrow concepts from workflow and business process management efforts in the computing
industry, and XML, a document representation that can be used to describe directed acyclic
graphs. The diagrams are also loosely based on the mathematical modeling language called
Petri nets.
The Framework and supporting care pathways reside entirely on the mobile device. Care
pathways are represented internally in any number of formats in order to support multiple vendor
solutions. emPATH supports its own representation of care pathways using the XML document
protocol in order to support advanced features dealing with dynamic care pathways (or self-*
systems [2]). External rendering systems that interact with the emPATH framework see only one
consistent interface regardless of the internal representation.
emPATH comprises two frameworks: a general framework, the Core Framework, which
contains features necessary to support mobile medical applications, and a second
framework, the EBM Framework, which directly supports dynamic care pathways. A major
feature of the emPATH framework is the internal blackboard system.
[1] Artifact holding or wearing the mobile device. An artifact may be an inanimate or animate
object.
[2] Pathways that can self-heal (prune), self-organize (change pathway ordering), self-generate
(add new pathways), and self-optimize (prune redundant pathway steps).
The blackboard system supports connectivity between the services of the Core Framework and the services of the EBM
Framework. All services in emPATH have access to the blackboard system and can view real-
time changes to the blackboard. The blackboard system acts as a "chalk board" where services
can write items of interest that can trigger other services within the mobile application. For
example, a service monitoring patient temperature could write to the blackboard that the patient's
temperature has exceeded a threshold. This could trigger care pathways or other services in the
application. Figure 3 shows an example in which the emPATH blackboard drives the behavior of
a mobile application. A patient-worn pH sensor has posted to the blackboard a pH value of 4.3.
emPATH activates (executes) all pathways that respond to the sensor pH value on the
blackboard. Care pathways can have defined activation criteria based on an ontology.
Pathways may also affect their own processing and even start the processing of other related
pathways merely by populating the blackboard. Figure 4 shows a pathway that affects its own
processing and indirectly starts another pathway.
Figure 3: Sensor Activating a Care Pathway
Figure 4: Self-Affecting Care Pathway
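As an illustration of how blackboard contents can activate a pathway, the intervention sketch below uses the activation-condition form described later in this guide. The CLIPS-style condition assumes that a sensor service has posted a pH fact to the blackboard, mirroring Figure 3; the 4.0 threshold is purely illustrative:

<Interventions>
  <Intervention>
    <!-- Activate this intervention whenever a sensor posts a pH value above 4.0 -->
    <Active>
      <Condition lang="clips">(sensor pH ?val)(test (> ?val 4))</Condition>
    </Active>
  </Intervention>
</Interventions>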
General Pathway Structure
A mobile application embedding the emPATH Framework can support any number of care
pathways. The mobile application is essentially a collection of care pathways in which the
pathways execute in a myriad of scenarios. For example, in an event-driven solution a pathway
may execute as a result of the wearer's heart rate exceeding a threshold. In swarm-based
solutions [3], a pathway may execute if the wearer enters an area in which a mobile swarm is
manifesting in reaction to a catastrophic event such as a major earthquake. The designer of the
care pathway is not necessarily aware of how the pathway will execute or under what situation.
emPATH supports a number of explicit and collective behaviors to support very complex
scenarios such as:
• If the wearer has a heart rate that exceeds the defined threshold, execute the related
pathways and notify relevant care providers of the event. Locate a nearby medical
assistant if available and inform their device of the situation and of patient vitals.
• If the wearer is entering an area of concern, warn the wearer with an audible "ping" and
execute the related pathways to guide the wearer for immediate evacuation from the
area.
• Process messages sent from the patient's Electronic Medical Record system to the
wearer's device indicating that a new medication regimen has been prescribed by a care
provider. Upload the associated care pathways and place them in the internal "pool" of
existing pathways, ready for execution when necessary.
• One of the wearer's mobile applications requests information about the patient's
current state and well-being. The corresponding care pathways start execution and
process the responses from the patient and sensors by updating the internal personal
health record. That update causes the execution of associated care pathways to respond
to any potential patient health episode.
Pathway designers can indicate the goals of a pathway such as lowering cholesterol, weight
reduction, panic attack resolution, etc. These goals can be used to construct mobile solutions to
help manage patients under treatment.
A care pathway consists of five major components or resources:
1. Interventions - an intervention represents a goal defined by the care provider for a
patient.
2. Interactions - individual tasks or steps that are used to achieve a goal defined by an
intervention.
3. Responses - system-accepted results from executing an interaction.
4. Decision Points - a special type of interaction representing junctures within a care plan
at which a decision is required to determine the correct path in the pathway.
5. Behaviors - processing code that can be referenced within interventions, interactions,
and responses. Behaviors are typically used to do autonomous work for an interaction.
[3] Collection of distributed autonomous artifacts displaying collective behavior.
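The XML elements for each of these resources are described in detail later in this guide. As a rough orientation, the skeleton below (with all element content omitted) sketches how the five resource types typically nest within a pathway document:

<CarePathway>
  <Interventions>
    <Intervention>                <!-- goal defined by the care provider -->
      <Interactions>
        <Interaction>             <!-- individual task or step toward the goal -->
          <PathwayBehaviors/>     <!-- behaviors: processing code for autonomous work -->
          <Responses>
            <Response/>           <!-- system-accepted result of the interaction -->
          </Responses>
        </Interaction>
        <DecisionPoint/>          <!-- juncture at which the next path is decided -->
      </Interactions>
    </Intervention>
  </Interventions>
</CarePathway>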
Interventions
An intervention can be viewed as a container of interactions whose collective task is to achieve a
goal. The interactions within the encompassing intervention are said to be bounded by the
intervention. This is an important concept when describing the world model of a care pathway.
The world model is a description of the world, as it is known to the care pathway. The world
model can help drive the goals of a care pathway. Two identical care pathways can execute
differently based on the state of the world model. The bounded interactions inherit the world
model of the intervention. Interventions may also contain embedded interventions.
Figure 5 shows an example collection of interventions stored on the mobile device. Notice that
the interventions are highly focused and specific to the wearer. The set of interventions within
second-generation [4] mobile solutions will change over time to reflect the changes to the patient's
status. Supporting collections of interventions provides the dynamic nature of second-generation
medical solutions. The collection can change in real-time as the patient's health changes. The
collection can change in reaction to patient medical encounters such as the care provider
requesting the patient to lose weight. Intervention collections provide a new dimension for
developers constructing medical mobile applications.
Figure 5: Focused Mobile-Based Interventions
Figure 6 shows how interventions and interactions are related. The entire pathway is represented
by the intervention and the interactions it contains.
[4] Mobile solutions which will be more attuned to the device, the wearer, the wearer's afflictions,
and to the wearer's surrounding environment.
Figure 6: Care Pathway: Interventions and Interactions
Interactions
Interactions represent the core resource within pathways. Interactions do the actual work. Work
includes communicating with external systems, interacting with internal and external sensors,
requesting data from the wearer, and executing algorithms. Figure 7 shows the major parts of an
interaction. The parts consist of:
• PreBehavior - represents local on-board software that executes prior to the execution of
the interaction. Valuable for providing any initialization or setup for the interaction. The
preBehavior software has the ability to change parts of the interaction prior to execution.
• Pathway Behavior - in traditional process or workflow systems, the behavior is the
"worker", "performer", or "implementation" of the interaction. The behavior represents
local on-board software that executes the interaction. A behavior is only present if the
interaction represents an autonomous step. If the interaction supports a Human-in-loop
(HIL), then the rendering engine will use other information in the interaction to
communicate with the wearer.
• PostBehavior - represents local on-board software that executes after completion of the
interaction. PostBehaviors are valuable for providing any post-processing for the
interaction such as persisting data, manipulating response data, or cleaning up after the
interaction.
• Outgoing Transitions - the outgoing transitions represent all transitions that originate
from the interaction and are destined for other interactions or decision points.
Figure 7: Interaction Structure
For interactions that support HIL, the interaction may contain a collection of responses or what is
known as an "answer set". Each response represents a possible "answer" for the interaction.
For example, if the interaction needs to ask the wearer if they have oral pain while eating, one
possible answer or response is "intense". The rendering engine using the emPATH framework is
free to use the answer set in any way it finds beneficial. The rendering engine can ignore the
answer set, augment the answer set through a preBehavior, or follow the answer set verbatim.
Figure 8 shows the relationship between an interaction and its corresponding answer set.
Figure 8: Interactions and Answer Sets
Transitions
Most care pathways are not simply a sequential set of interactions. The transition from one
interaction to another may be based on the current state of the wearer and the status of the
wearer's health. A pathway designer must be able to indicate in the pathway where a decision is
required to determine the next step of an intervention. emPATH provides three constructs for
changing the path of a care pathway:
• "Skip To" instructions
• Interaction transitions
• Decision Point resources
Skip-to instructions are inserted within response resources and reference specific interactions to
"skip to". If the wearer selects a response and the response contains a skip-to instruction, the
engine will select the next interaction from the instruction. The skip-to instruction may not contain
a condition. The skip-to instruction is always followed if the corresponding response is selected.
Figure 9 shows the use of the skip-to construct.
Figure 9: Skip-To Instructions
The second transition construct is interaction transitions. These are transitions specified within
the interaction as opposed to within the response. Interaction transitions are typically used when
the interaction is autonomous (no human-in-loop). Figure 10 shows the use of transitions within
an interaction. This type of transition does support conditions that can reference the blackboard.
Any condition that is satisfied will result in a transition. Hence, more than one transition can
occur.
Figure 10: Interaction Transitions
Decision points are the third type of transition construct. Decision points represent specific points
within a pathway where either multiple paths exist in the care pathway or multiple paths merge in
the care pathway. Decision points that are used to support multiple paths in the pathway are
known as "splits". Decision points that are used to support merging paths are called "joins". The
type of decision point determines how many paths are followed or how many paths are merged.
Figures 11 and 12 display the types of splits supported by emPATH. Splits are handled as
follows:
• AND Split: All transitions are followed. There are no conditions related to each transition.
• OR Split: Any transition whose condition is satisfied is activated.
• XOR Split: The first transition whose condition is satisfied is activated. All others are
ignored.
Figure 13 displays the types of merges supported by emPATH. Merges are handled as follows:
• AND Join: The decision point is not active until all transitions going into the decision point
are active. This decision point allows the emPATH engine to synchronize all inbound
transitions to that point in the pathway. In essence, the engine will wait until all
transitions inbound to the decision point have completed.
• OR Join: The decision point is active when any of the transitions inbound to the decision
point is active. This is useful when there are no dependencies among the various
transitions inbound to the decision point.
Figure 11: AND, OR Decision Point Splits
Figure 12: XOR Decision Point Split
Figure 13: Decision Point Joins
Decision points may also be used to construct iteration patterns within a care pathway. Iteration
patterns are useful for constructing repetitive sequences of interactions such as the processing of
multiple sensor data from a body sensor network. Two decision points mark the beginning and
end of the iteration. One decision point manages the condition that determines if an interaction is
complete. The other decision point manages the iteration loop. Figure 14 shows an iteration
supporting a classic "for loop" construct found in numerous programming languages. The first
decision point manages the condition and the second decision point manages the loop. The first
decision point also provides the transition that is followed once the iteration completes.
Figure 14: Decision Points for Repetitive Loops. Pre-Loop Decision
Figure 15 shows an iteration supporting a classic "while loop" construct found in numerous
programming languages. The first decision point manages the loop and the second decision
point manages the condition. The second decision point also provides the transition that is
followed once the iteration completes.
Figure 15: Decision Points for Repetitive Loops. Post-Loop Decision
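The following sketch shows the pre-loop ("for loop") pattern of Figure 14 using the <DecisionPoint>, <Transition>, and <Interaction> elements described later in this guide. The identifiers, the loop condition, and the loop-body behavior are illustrative assumptions; the first decision point tests whether more sensor readings remain, and the second unconditionally transitions back to it:

<DecisionPoint>
  <ID>http://www.carethings.com/userId#LoopCondition</ID>
  <Type>OrSplit</Type>
  <Transitions>
    <!-- A reading remains on the blackboard: execute the loop body -->
    <Transition>
      <To>http://www.carethings.com/userId#LoopBody</To>
      <Active>
        <Condition>(sensor reading ?val) (test exists)</Condition>
      </Active>
    </Transition>
    <!-- No readings remain: exit the iteration -->
    <Transition>
      <To>http://www.carethings.com/userId#AfterLoop</To>
      <Active>
        <Condition>(sensor reading ?val) (test not exists)</Condition>
      </Active>
    </Transition>
  </Transitions>
</DecisionPoint>
<Interaction>
  <ID>http://www.carethings.com/userId#LoopBody</ID>
  <Type>Autonomous</Type>
  <PathwayBehaviors>
    <Behavior>http://www.carethings.com/behavior#ProcessReading</Behavior>
  </PathwayBehaviors>
</Interaction>
<!-- The second decision point unconditionally loops back to the condition check -->
<DecisionPoint>
  <ID>http://www.carethings.com/userId#LoopBack</ID>
  <Type>AndSplit</Type>
  <Transitions>
    <Transition>
      <To>http://www.carethings.com/userId#LoopCondition</To>
    </Transition>
  </Transitions>
</DecisionPoint>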
The Semantic Web and emPATH
The healthcare industry and life science research are moving towards the support of open data.
Open data is medical data that can be readily shared among institutions for the benefit of patient
research and care. Shared data includes care pathways, clinical data results, clinical
observations, and real-time medical information. The World Wide Web Consortium (W3C) is
defining a data model and tools for data sharing. That data model and its related tools are called
the Semantic Web. emPATH fully complies with the protocols defined by the W3C for the Semantic
Web. Data stored on-board the mobile device is in compliance with Semantic Web standards
and can be readily referenced from within care pathways. This is a powerful approach because
emPATH applications can then receive and process data directly from other applications that
follow the Semantic Web protocols. This will be very important since second generation solutions
will support device-to-device communication. In addition, data generated by the mobile device is
externalized in support of the Semantic Web.
The Semantic Web defines the format of the data but not the content. Content values are defined
by ontologies. emPATH can support any number of ontologies defined by leading institutions.
For example, second generation mobile applications generated by Kaiser Permanente can be
designed to generate mobile data so that the data can be readily shared and understood within
Kaiser and externalized to other institutions when beneficial.
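Within a pathway document, ontology references appear as URIs on individual resources via the <Ontology> element, as seen in the examples later in this guide. For instance, an interaction can declare the ontology class that describes its content (the URI below is taken from a later example):

<Interaction>
  <!-- The rendering engine may translate this ontology class into an appropriate rendering -->
  <Ontology>http://www.carethings.com/ontology/painValue5</Ontology>
  <Type>InterrogativeSingleSelect</Type>
</Interaction>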
Resource Identification
emPATH uses W3C Uniform Resource Identifiers (URIs) for resource identification. Resources
can be defined remotely or locally on-board the mobile device. Care pathways can share
resource definitions. For example, the following URIs reference a behavior with the name
"AnalyzeData" which can be referenced within multiple care pathways. Notice that each URI
indicates the construct (programming language) used to create the behavior and hence how to
execute the resource:
http://www.carethings.com/objc/behaviorId#AnalyzeData
http://www.carethings.com/php/behaviorId#AnalyzeData
http://www.carethings.com/java/behaviorId#AnalyzeData
emPATH follows the "linked data" approach to representing care pathway data as a way of
sharing the care pathways among interested parties. Many resources referenced in a care
pathway use URIs as an identifier. Resources in a pathway include ontologies, system
identifiers, behaviors, text resources, and audio resources.
Interventions, interactions, decision points, and responses are each assigned a system identifier
by the emPATH framework. The system identifier is defined to be web-wide unique. emPATH
uses the URI path "www.carethings.com/sysId" to indicate that the URI represents a resource
identifier. The fragment identifier of the URI is the actual system identifier. Pathway designers
may also define a resource identifier but should be very careful when depending on the identifier
for constructs such as transitions. The following XML shows an emPATH interaction with both a
system identifier and a designer-defined identifier:
<Interaction>
<SystemID>http://www.carethings.com/sysId#60bd</SystemID>
<ID>http://www.carethings.com/userId#myID</ID>
</Interaction>
emPATH Data Model
emPATH uses the wearer's health information to drive a number of pathway features including
constraints, world model representation, and clinical data. For example, a pathway designer can
indicate that an intervention is not applicable unless the wearer has diabetes. The designer must
represent that information within emPATH using a data model. emPATH supports the Resource
Description Framework (RDF) Data Model which uses RDF triples to represent information. This
implies that the designer must ultimately represent the constraint of "having diabetes" using RDF
triples.
Let's continue the example in which the wearer has diabetes. The RDF triple may look as follows:
(JohnBerner hasAffliction diabetes)
emPATH stores thousands of RDF triples to represent all types of information about the wearer.
RDF triples can be used within the care pathway where data is referenced. The following care
pathway XML snippet represents the condition that the care pathway interaction is only applicable
if the wearer has type 2 diabetes. The pathway uses the <Active> XML element to list triple
patterns that must be satisfied in order for the interaction to be applicable:
<Interaction>
<Active>
<Condition>(JohnBerner hasAffliction diabetes)</Condition>
<Condition>(JohnBerner diabetesType type2)</Condition>
</Active>
</Interaction>
Using RDF triples allows pathway designers to access data from the on-board patient health
record system and data derived from external sources such as sensors. In addition, the pathway
can reference data outside the bounds of the mobile device by using complete URIs within the
RDF triples as follows:
<Interaction>
<Active>
<Condition>
(http://www.aclinic.org/patientName#JohnBerner
http://www.aclinic.org/props/hasAffliction
http://www.aclinic.org/diseases/diabetes)
</Condition>
<Condition>(John diabetesType type2)</Condition>
</Active>
</Interaction>
Care Pathway Representation
Care pathways can be represented internally within emPATH in any number of protocols.
emPATH will read the pathway from disk (either from a remote server or from the mobile device),
interpret it, and then make the pathway available to the other subsystems of emPATH and the
rendering engine. emPATH subsystems and the rendering engine(s) never see the internal
representation of the pathway. emPATH provides a well-defined object-based interface for
external software systems to use to process the pathway. Figure 16 shows the general flow for
processing Human-In-Loop (HIL) interactions. emPATH provides a well-defined XML
representation of care pathways. Developers are free to extend the emPATH framework to
process other protocols.
Figure 16: Supporting Multiple Care Pathway Formats
The following sections will describe how emPATH pathways are represented using the XML
protocol.
Care Pathway
A care pathway is described by the root XML element <CarePathway>. Within the root element
are the definitions of the interventions, interactions, decision points, and responses. Elements
within the care pathway may specify information about the author of the pathway, any related
studies if this pathway contributes to a clinical trial, and information for the emPATH engine. A
sample XML for a care pathway is as follows:
<CarePathway>
<PathwayXMLAuthor>Larry Suarez</PathwayXMLAuthor>
<PathwayXMLOntology>CocaineMonitoring</PathwayXMLOntology>
<Priority></Priority>
<Concurrency></Concurrency>
<Resources>
<Resource>
<Type>Camera</Type>
<Identifier>C-1</Identifier>
<Duration></Duration>
</Resource>
<Resource>
<Type>Sensor</Type>
<Identifier>ACC-1</Identifier>
<Duration></Duration>
</Resource>
</Resources>
<Study>
<PrimaryResearcher>
<NameAddress>
<FirstName>Mary</FirstName>
<LastName>Menz</LastName>
</NameAddress>
<URL>http://www.ucsf.edu</URL>
</PrimaryResearcher>
<Participants>
<Participant>
<PartyType>Patient</PartyType>
<PartyIdentifier>
<Type>HAP-ID</Type>
<Identifier>11111111</Identifier>
</PartyIdentifier>
<NameAddress>
<FirstName>Sylvia</FirstName>
<LastName>Sanders</LastName>
</NameAddress>
<URL>http://www.ucsf.edu</URL>
<DateOfBirth>9/26/60</DateOfBirth>
<Gender>F</Gender>
<Sample>Gen-Female</Sample>
</Participant>
</Participants>
<DataCollectors>
<DataCollector>
<PartyType>Nurse</PartyType>
<Organization>String</Organization>
<NameAddress>
<FirstName>Jennifer</FirstName>
<LastName>Larson</LastName>
</NameAddress>
<URL>http://www.ucsf.edu</URL>
</DataCollector>
</DataCollectors>
</Study>
<Interventions></Interventions>
</CarePathway>
Interventions
An intervention typically represents a goal as defined by a care provider. For example, there may
be an intervention to represent a patient losing fifty pounds. An intervention consists of one or
more interactions. Interventions are useful for grouping interactions under a common ontology.
Interventions provide information that can be shared with and accessed by all of their contained
interactions. Interventions may also contain embedded interventions. This is useful for abstracting
out collections of interactions. The following XML example shows the general structure of an
intervention consisting of interactions and embedded interventions. Interactions are always
executed in order of appearance in the XML unless changed by Decision Point constructs
(discussed later):
<Interventions>
<Intervention>
<Interactions>
<Interaction></Interaction>
<Interaction></Interaction>
<Interaction></Interaction>
<Intervention></Intervention> <!-- Embedded intervention -->
</Interactions>
</Intervention>
</Interventions>
The rendering engine can invoke an intervention explicitly. This is typically the case when there
is a human-in-loop and the wearer requests the intervention, for example, when the intervention
represents a survey. If the intervention is autonomous, the intervention will have an activation
condition that indicates to the emPATH engine when the intervention should start execution. For
example, suppose that the care provider wants their care pathway to execute when the patient's
esophageal pH level rises above 6.0. The XML would look as follows:
<Interventions>
<Intervention>
<Active>
<Condition lang="clips">
(sensor pH ?val)(test (> ?val 6))
</Condition>
</Active>
</Intervention>
</Interventions>
An intervention may also contribute to the wearer’s world model as part of the execution process.
This is useful for setting global information that the contained interactions need during their
processing. For example, suppose that the intervention contains a number of interactions that
monitor patient status and alert the care provider of any issues. The intervention would like to
set up the various thresholds used by the alert detection interactions. The XML would look as
follows:
<Interventions>
<Intervention>
<WorldModel>
<Fact lang="clips">(pH alert 6)</Fact>
<Fact lang="clips">(heartRate alert 170)</Fact>
<Fact lang="clips">(weight alert 250)</Fact>
</WorldModel>
</Intervention>
</Interventions>
The care pathway designer can request that a behavior execute before and/or after the execution
of an intervention. Behaviors that execute before an intervention can be used to set up
environments, initialize constructs, send notifications to providers, and perform general setup.
Behaviors that execute after an intervention can be used to clean up environments, send
completion notifications, and perform general cleanup. An example XML looks as follows:
<Interventions>
<Intervention>
<PreBehavior>
http://www.carethings.com/behavior#SetupAlerts
</PreBehavior>
<PostBehavior>
http://www.carethings.com/behavior#FlushAlerts
</PostBehavior>
</Intervention>
</Interventions>
A major goal for care pathways executing on mobile devices is real-time interventions. To
support real-time interventions, emPATH supports the specification of real-time constraints for
interventions and interactions. For example, suppose the intervention must complete within two
hours due to health response requirements. An example XML would look as follows:
<Interventions>
<Intervention>
<RealTimeConstraints>
<MaxDwell>120</MaxDwell>
</RealTimeConstraints>
</Intervention>
</Interventions>
Finally, certain care pathways may require execution at defined times during a care regimen. For
example, the care provider may wish the care pathway to execute every other day at 10:00 AM
for daily exercise. Or the provider may wish the care pathway to execute at 8:00 PM every day
so the patient can enter daily diary data. The following XML is an example where the time
constraint is applied only on the 3rd week and 4th day of the clinical study.
<Interventions>
<Intervention>
<RealTimeConstraints>
<Schedule>
<Week>3</Week>
<Day>4</Day>
<AtMost>OnceADay</AtMost>
<TimeRange>
<DateTimeFrom>11</DateTimeFrom>
<DateTimeTo>19</DateTimeTo>
</TimeRange>
</Schedule>
</RealTimeConstraints>
</Intervention>
</Interventions>
Interactions
Interactions represent the individual steps or tasks required to achieve a goal. Each interaction
has a "type" which indicates the intended meaning of the interaction. An interaction does not
necessarily imply a conversation with the wearer. Interactions may occur autonomously through
the application of behaviors. In addition, an interaction may be ignored completely because of
the wearer’s status. For example, an interaction to "turn on oxygen pump" is ignored if the patient
is not having an episode. emPATH supports the following types of interactions:
• Autonomous - the interaction does not require an external artifact (sensor, data source,
or wearer). Autonomous interactions are typically processed by behaviors.
• Interrogative Multiple Selection - the interaction is querying an external artifact for
information. The artifact may be a sensor, data source, or the wearer. The interaction is
expecting a set of data (more than one piece of information). The set typically consists of
RDF triples.
• Interrogative with Single Selection - the interaction is querying an external artifact for
information. The artifact may be a sensor, data source, or the wearer. The interaction is
expecting only one piece of data typically in the form of an RDF triple.
• Interrogative with Unstructured Response - the interaction is querying an external
artifact for information. The artifact may be a sensor, data source, or the wearer. The
interaction is expecting one piece of unstructured data, which may include text, photos,
video, or audio.
• Imperative - the interaction is expressing a command or request to the wearer. This may
be to instruct the wearer (for example, "exit the room") or command an inanimate object
(for example, "turn on oxygen pump").
• Declarative - the interaction is expressing something to the wearer. For example, "job
well done!".
For example, the following XML represents an interaction to ask the wearer how they feel:
<Interaction>
<Text>How do you feel?</Text>
<SystemID>http://www.carethings.com/sysId#60bd</SystemID>
<Type>InterrogativeSingleSelect</Type>
<Responses>
<Response>
<Label>Good</Label>
</Response>
<Response>
<Label>Fair</Label>
</Response>
<Response>
<Label>Poor</Label>
</Response>
</Responses>
</Interaction>
Interactions are quite often used to communicate with the wearer. This is known as a "human-in-
loop" (HIL) because the wearer is directly involved with the interaction. emPATH provides a
number features specifically for HIL interactions. The following features are supported:
• Indicate a text message to be shown to the wearer.
• Indicate a text resource to be shown to the wearer.
• Indicate an audio message to be played to the wearer.
• Indicate an ontology to be sent to the rendering engine. The assumption is that the
rendering engine can translate the ontology to the appropriate rendering for the wearer.
For example, suppose the interaction needs to determine if the wearer is experiencing pain while
eating. The XML would be as follows, in which the rendering engine is given the option of
displaying simple text, displaying rich text (a text resource), playing an audio snippet, and/or
providing an ontology:
<Interaction>
<Text>When you are eating, how intense is your pain?</Text>
<TextResource>
http://www.carethings.com/utterance#eatingPain.html
</TextResource>
<AudioResource>
http://www.carethings.com/utterance#audioEatingPain.m4a
</AudioResource>
<Ontology>
http://www.carethings.com/utterance#classEatingPain
</Ontology>
<SystemID>http://www.carethings.com/sysId#60bd</SystemID>
<Type>InterrogativeSingleSelect</Type>
<Responses>
<Response>
<Label>Intense</Label>
</Response>
<Response>
<Label>Mild</Label>
</Response>
<Response>
<Label>No Pain</Label>
</Response>
</Responses>
</Interaction>
Interactions support many of the same features as interventions including real-time constraints,
the ability to update the world model, and conditions that indicate whether the interaction is
active. For example, suppose that the interaction is only applicable if the wearer has breast
cancer. The XML for the interaction would look as follows. If the wearer does not have breast
cancer, the interaction is skipped.
<Interactions>
<Interaction>
<Active>
<Condition lang="clips">
(patient affliction breastCancer)
</Condition>
</Active>
</Interaction>
</Interactions>
Interactions that specify a behavior are said to be autonomous. The behavior is responsible for
executing the interaction. Once the behavior completes processing, the interaction is complete.
The wearer never sees the interaction and hence does not need to respond to the interaction.
The goal of second-generation mobile solutions is to be predominantly autonomous. It will be
difficult for second-generation and beyond mobile solutions to be HIL since wearers are restricted
in their available time to react to mobile applications. The XML for an autonomous interaction
would look as follows. If multiple behaviors are indicated, the behaviors are executed in the order
they appear in the XML.
<Interactions>
<Interaction>
<PathwayBehaviors>
<Behavior>
http://www.carethings.com/behavior#AnalyzeData
</Behavior>
<Behavior>
http://www.carethings.com/behavior#NotifyProvider
</Behavior>
</PathwayBehaviors>
</Interaction>
</Interactions>
Responses
Responses represent system-accepted ways of responding to interactions. Responses are
typically only used for HIL interactions. The emPATH framework makes no assumptions on how
the responses are used by the rendering engines. Interactions are not required to have
responses. When responses are available, rendering engines are not required to display the
responses verbatim to the wearer. A response may represent doing work if selected by the
wearer such as enabling the on-board camera. The type of the response indicates the intended
behavior. Accepted response types include:
• Free - the response consists of unstructured text with no data limit.
• FreeFixed - the response consists of unstructured text but limited to some specific
number of characters. Currently the rendering engine defines the limit.
• Directive - the response is considered an "instruction" to the wearer.
• Fixed - the response represents a choice such as a check box.
• FixedNext - the response represents a choice and selection by the wearer results in the
care pathway continuing execution to the next interaction.
• VAS - the response represents a Visual Analog Scale (VAS).
• DVAS - the response represents a Digital Visual Analog Scale.
• Camera - the response, if selected, results in the mobile device activating the on-board
camera.
• Video - the response, if selected, results in the mobile device activating the on-board
video camera.
• Sensor - the response, if selected, results in the mobile device activating the associated
sensor.
• Scan - the response, if selected, results in the mobile device activating the on-board
camera. The resulting photo is assumed to contain a barcode. On-board barcode
scanning software interprets the photo and the resulting barcode, if successful, is stored
in the response object.
• OCR - the response, if selected, results in the mobile device activating the on-board
camera. The resulting photo is assumed to contain text. On-board OCR software
interprets the photo and the resulting text is stored in the response object.
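The later examples in this guide show only the Fixed and Directive response types. As a sketch of how one of the device-oriented types might appear, a Scan response uses the same structure with a different type value; the system identifier and label below are illustrative:

<Response>
  <SystemID>http://www.carethings.com/sysId#28900</SystemID>
  <!-- Selecting this response activates the camera and runs the on-board barcode scanner -->
  <Type>Scan</Type>
  <Label>Scan the medication barcode</Label>
</Response>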
The response construct contains a number of features that are intended for HIL interaction.
Given the assumption that a response will be visualized on the mobile device for wearer
interaction, the following features are supported.
• Value constraints - a list of values from which a wearer can choose the appropriate value.
For example, the wearer may be asked for their zip code. The value constraints would
list applicable zip codes in the wearer's immediate area.
• Code value - typically a study recognized identifier for the response. Provides the ability
for a researcher to use the pathway in conjunction with existing research software.
• Label - text displayed by the rendering engine for the response. The rendering engine is
free to use the text in any way or manner.
• Label Resource - sometimes the response requires a more expressive way to
communicate with the wearer. The label resource references the content used in place of
the simple label, for example, an HTML page. The rendering engine will then render the
HTML page as opposed to simple text.
• Format - indicates the format of the response. This is only applicable for responses of
type Free or FreeFixed. The rendering engine is responsible for supporting the format
information. Currently supported formats include numeric, alpha, alphanumeric, date,
datetime, monetary, and phone.
Using a previous example, suppose the interaction needs to determine if the wearer is
experiencing pain while eating. The XML would be as follows:
<Interaction>
<Text>When you are eating, how intense is your pain?</Text>
<SystemID>http://www.carethings.com/sysId#60bd</SystemID>
<Type>InterrogativeSingleSelect</Type>
<Responses>
<Response>
<Label>Intense</Label>
<Code>STUDY1002-INTENSE</Code>
</Response>
<Response>
<Label>Mild</Label>
<Code>STUDY1002-MILD</Code>
</Response>
<Response>
<Label>No Pain</Label>
<Code>STUDY1002-NOPAIN</Code>
</Response>
</Responses>
</Interaction>
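The example above exercises only the Label and Code features. The sketch below illustrates a free-text response using the format and value-constraint features; the element names <Format>, <ValueConstraints>, and <Value> are assumptions based on the feature descriptions above and are not taken from an emPATH example:

<Response>
  <Type>FreeFixed</Type>
  <Label>Enter your zip code</Label>
  <!-- Assumed element: restricts the free-text entry to numeric characters -->
  <Format>numeric</Format>
  <!-- Assumed element: lists the zip codes in the wearer's immediate area -->
  <ValueConstraints>
    <Value>94143</Value>
    <Value>94158</Value>
  </ValueConstraints>
</Response>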
Transitions
Transitions in emPATH are supported in three ways: "skip-to" instructions, decision points, and
interaction transitions.
Skip-to instructions are provided within the response constructs of care pathways. The
instructions indicate which interaction to transition to if the response is selected. This is the
simplest form of transitioning supported by emPATH. The wearer is indirectly deciding the
transition for the pathway by selecting a response. emPATH provides the response element
<SkipTo> to indicate transitioning. The following XML shows the use of the <SkipTo> element for
transitioning.
<Interaction>
<Text>When you are eating, how intense is your pain?</Text>
<SystemID>http://www.carethings.com/sysId#60bd</SystemID>
<Type>InterrogativeSingleSelect</Type>
<Responses>
<Response>
<Label>Intense</Label>
<SkipTo>http://www.carethings.com/sysId#60ccc</SkipTo>
</Response>
<Response>
<Label>Mild</Label>
<SkipTo>http://www.carethings.com/sysId#60ccc</SkipTo>
</Response>
<Response>
<Label>No Pain</Label>
<SkipTo>http://www.carethings.com/sysId#60cff</SkipTo>
</Response>
</Responses>
</Interaction>
Interaction skip-to's are accomplished the same way except that the instruction appears within
the <Interaction> element as follows:
<Interaction>
<Text>When you are eating, how intense is your pain?</Text>
<SystemID>http://www.carethings.com/sysId#60bd</SystemID>
<Type>InterrogativeSingleSelect</Type>
<SkipTo>http://www.carethings.com/sysId#60ccc</SkipTo>
</Interaction>
The skip-to is followed once the interaction has completed processing. If both response skip-to
instructions and interaction skip-to instructions exist, the response instructions take precedence.
Decision points are specific junctures within a care plan at which a decision is made to
determine the next path in the pathway. Decision points use the emPATH blackboard to derive
the decision; a decision point can therefore be influenced by manipulating the blackboard. Decision
points consist of one or more decisions and the resulting interactions. There are four types of
decision points:
• And Split: This is a decision in which more than one path in the care pathway can be
chosen. Typically each path executes in parallel.
• Or Split: This is a decision in which only one path is chosen in the care pathway.
• And Join: This is a decision point where the decision is waiting for more than one path in
the care pathway to complete.
• Or Join: This is a decision point where the decision is waiting for only one path in a care
pathway to complete.
A decision point represented in XML looks very much like an interaction. The following XML
shows an And-Split decision point. The list of possible transitions is expressed using the
<Transition> element.
<DecisionPoint>
<ID>http://www.carethings.com/userId#DP1</ID>
<Ontology>
http://www.carethings.com/ontology/painValue5
</Ontology>
<SystemID>http://www.carethings.com/sysId#35ae3d</SystemID>
<Type>AndSplit</Type>
<Transitions>
<Transition>
<SystemID>http://www.carethings.com/sysId#2333238</SystemID>
<To>http://www.carethings.com/userId#DP1</To>
<Active>
<Condition>
(?val relative oralPain) (test exists)
</Condition>
</Active>
</Transition>
<Transition>
<SystemID>http://www.carethings.com/sysId#2333238</SystemID>
<To>http://www.carethings.com/userId#DP1</To>
<Active>
<Condition>
(?val relative oralPain) (test not exists)
</Condition>
</Active>
</Transition>
</Transitions>
</DecisionPoint>
Join decision points do not require any special XML constructs since their behavior is
provided within emPATH. Only the type value is required for any type of join decision point. An
example And-Join decision point would look as follows.
<DecisionPoint>
<ID>http://www.carethings.com/userId#DP1</ID>
<Ontology>
http://www.carethings.com/ontology/painValue5
</Ontology>
<SystemID>http://www.carethings.com/sysId#35ae3d</SystemID>
<Type>AndJoin</Type>
</DecisionPoint>
The final type of transition involves transitions from within interactions. These transitions do not
involve responses. Transitions from an interaction are used for autonomous interactions. The
following is an example XML:
<Interaction>
<Text>When you are eating, how intense is your pain?</Text>
<SystemID>http://www.carethings.com/sysId#60bd</SystemID>
<Type>Autonomous</Type>
<Transitions>
<Transition>
<SystemID>http://www.carethings.com/sysId#2333238</SystemID>
<To>http://www.carethings.com/userId#DP1</To>
<Active>
<Condition>
(?val relative oralPain) (test exists)
</Condition>
</Active>
</Transition>
<Transition>
<SystemID>http://www.carethings.com/sysId#2333238</SystemID>
<To>http://www.carethings.com/userId#DP1</To>
<Active>
<Condition>
(?val relative oralPain) (test not exists)
</Condition>
</Active>
</Transition>
</Transitions>
</Interaction>
Behaviors
It would be difficult to model all aspects of a care pathway within an XML definition. Care pathways
need to execute algorithms that cannot be described through XML. For these situations,
emPATH defines behaviors. Behaviors are assumed to execute within the mobile device. For
example, for Apple devices such as the iPhone and iPad, a behavior is defined as an Objective-C
class. For Android-based devices, a behavior is defined as a Java class. For J2ME-based
phones, behaviors are J2ME Java classes. Behaviors can be referenced within interactions,
interventions, and responses. There are three points at which a behavior can be invoked in an
emPATH resource:
• Before processing of the resource (pre-processing).
• During processing of the resource (autonomous).
• After processing of a resource (post-processing).
For example, suppose the care pathway would like to ask the wearer for their level of pain but
would like to have the question reference the last pain value derived from the on-board personal
health record. The behavior LastPainResponse constructs the appropriate question and adds it
to the interaction.
<Interaction>
<Text>Your last pain level was XXXX. What is your current
level?</Text>
<PreBehavior>LastPainResponse</PreBehavior>
<Type>InterrogativeSingleSelect</Type>
<Responses>
<Response>
<Label>Most intense pain</Label>
</Response>
<Response>
<Label>Tolerable</Label>
</Response>
<Response>
<Label>No pain</Label>
</Response>
</Responses>
</Interaction>
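The example above uses only a preBehavior. The sketch below combines all three invocation points on a single interaction; the behavior URIs are illustrative, and the element names follow the intervention and interaction examples elsewhere in this guide:

<Interaction>
  <!-- Pre-processing: runs before the interaction executes -->
  <PreBehavior>http://www.carethings.com/behavior#LoadLastPainValue</PreBehavior>
  <!-- Autonomous processing: the behavior performs the interaction's work -->
  <PathwayBehaviors>
    <Behavior>http://www.carethings.com/behavior#AnalyzePainTrend</Behavior>
  </PathwayBehaviors>
  <!-- Post-processing: runs after the interaction completes, e.g. to persist results -->
  <PostBehavior>http://www.carethings.com/behavior#PersistPainRecord</PostBehavior>
</Interaction>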
Example Care Pathways
This section defines a number of care pathways using the emPATH-supported XML protocol.
The intent is to allow pathway designers to construct and deploy emPATH care pathways quickly.
The XML examples are defined to be complete and fully functional.
Example 1: Pain Survey (2 Questions)
<CarePathway>
<Interventions>
<Intervention>
<SystemID>http://www.carethings.com/sysId#d2c157</SystemID>
<Interactions>
<Interaction>
<ID>http://www.carethings.com/userId#Q1</ID>
<Text>When you ARE NOT talking, eating, or drinking,
how intense (severe, strong) is the pain in your
mouth?
</Text>
<SystemID>http://www.carethings.com/sysId#35ae3d</SystemID>
<Type>InterrogativeSingleSelect</Type>
<ResponseRequired>YES</ResponseRequired>
<Responses>
<Response>
<SystemID>
http://www.carethings.com/sysId#285510
</SystemID>
<Type>Fixed</Type>
<Label>No pain</Label>
</Response>
<Response>
<SystemID>
http://www.carethings.com/sysId#2222610
</SystemID>
<Type>Fixed</Type>
<Label>No pain</Label>
</Response>
<Response>
<SystemID>
http://www.carethings.com/sysId#266610
</SystemID>
<Type>Fixed</Type>
<Label>
The most intense pain sensation imaginable
</Label>
</Response>
</Responses>
</Interaction>
<Interaction>
<ID>http://www.carethings.com/userId#Q2</ID>
<Text>When you ARE talking, eating, or drinking, how
intense (severe, strong) is the pain in your mouth?
</Text>
<SystemID>http://www.carethings.com/sysId#f992c7</SystemID>
<Type>InterrogativeSingleSelect</Type>
<ResponseRequired>YES</ResponseRequired>
<Responses>
<Response>
<SystemID>
http://www.carethings.com/sysId#285444
</SystemID>
<Type>Fixed</Type>
<Label>No pain</Label>
</Response>
<Response>
<SystemID>
http://www.carethings.com/sysId#28610
</SystemID>
<Type>Fixed</Type>
<Label>The most intense pain sensation
imaginable
</Label>
</Response>
</Responses>
</Interaction>
<Interaction>
<ID>http://www.carethings.com/userId#iFinalInteraction</ID>
<Text/>
<Type>Imperative</Type>
<PathwayBehaviors>
<Behavior>
http://www.carethings.com/sysBehavior/className#Final
</Behavior>
</PathwayBehaviors>
</Interaction>
</Interactions>
</Intervention>
</Interventions>
</CarePathway>
Example 2: Pain Survey with Conditional Transitions (3 Questions)
<CarePathway>
<Interventions>
<Intervention>
<SystemID>http://www.carethings.com/sysId#d2c157</SystemID>
<Ontology>
http://www.carethings.com/ontology/mainInteraction
</Ontology>
<Interactions>
<Interaction>
<ID>http://www.carethings.com/userId#Q1</ID>
<Text>When you ARE NOT talking, eating, or drinking,
how intense (severe, strong) is the pain in your
mouth?
</Text>
<Ontology>
http://www.carethings.com/ontology/painValue5
</Ontology>
<SystemID>http://www.carethings.com/sysId#35ae3d</SystemID>
<Type>InterrogativeSingleSelect</Type>
<ResponseRequired>YES</ResponseRequired>
<Responses>
<Response>
<SystemID>
http://www.carethings.com/sysId#254610
</SystemID>
<Type>Fixed</Type>
<Label>No pain</Label>
<WorldModel>
<Fact>
(patient painlevel nopain)
</Fact>
</WorldModel>
</Response>
<Response>
<SystemID>
http://www.carethings.com/sysId#243610
</SystemID>
<Type>Fixed</Type>
<Label>
The most intense pain sensation imaginable
</Label>
<WorldModel>
<Fact>
(patient painlevel intense)
</Fact>
</WorldModel>
</Response>
</Responses>
<Transitions>
<Transition>
<SystemID>
http://www.carethings.com/sysId#233238
</SystemID>
<Active>
<Condition>
(patient painlevel nopain)
</Condition>
</Active>
<To>http://www.carethings.com/sysId#f992c7</To>
</Transition>
<Transition>
<SystemID>
http://www.carethings.com/sysId#2333238
</SystemID>
<Active>
<Condition>
(patient painlevel intense)
</Condition>
</Active>
<To>http://www.carethings.com/sysId#f99444</To>
</Transition>
</Transitions>
</Interaction>
<Interaction>
<ID>http://www.carethings.com/userId#Q2</ID>
<Text>When you ARE talking, eating, or drinking, how
intense (severe, strong) is the pain in your mouth?
</Text>
<Ontology>
http://www.carethings.com/ontology/interaction
</Ontology>
<SystemID>http://www.carethings.com/sysId#f992c7</SystemID>
<Type>InterrogativeSingleSelect</Type>
<ResponseRequired>YES</ResponseRequired>
<Responses>
<Response>
<SystemID>
http://www.carethings.com/sysId#28
</SystemID>
<Type>Directive</Type>
<Label>No pain</Label>
</Response>
<Response>
<SystemID>
http://www.carethings.com/sysId#28610
</SystemID>
<Type>Directive</Type>
<Label>The most intense pain sensation
imaginable
</Label>
</Response>
</Responses>
</Interaction>
<Interaction>
<ID>http://www.carethings.com/userId#iFinalInteraction</ID>
<Text/>
<Type>Imperative</Type>
<PathwayBehaviors>
<Behavior>
http://www.carethings.com/sysBehavior/className#Final
</Behavior>
</PathwayBehaviors>
</Interaction>
<Interaction>
<ID>http://www.carethings.com/userId#Q3</ID>
<Text>When you ARE NOT talking, eating, or drinking, how
sharp (like a knife) is the pain in your mouth?
</Text>
<Ontology>
http://www.carethings.com/ontology/interaction
</Ontology>
<SystemID>http://www.carethings.com/sysId#f99444</SystemID>
<Type>InterrogativeSingleSelect</Type>
<ResponseRequired>YES</ResponseRequired>
<Responses>
<Response>
<SystemID>
http://www.carethings.com/sysId#28
</SystemID>
<Type>Directive</Type>
<Label>No pain</Label>
</Response>
<Response>
<SystemID>
http://www.carethings.com/sysId#28610
</SystemID>
<Type>Directive</Type>
<Label>The most intense pain sensation
imaginable
</Label>
</Response>
</Responses>
</Interaction>
<Interaction>
<ID>http://www.carethings.com/userId#iFinalInteraction</ID>
<Text/>
<Type>Imperative</Type>
<PathwayBehaviors>
<Behavior>
http://www.carethings.com/sysBehavior/className#Final
</Behavior>
</PathwayBehaviors>
</Interaction>
</Interactions>
</Intervention>
</Interventions>
</CarePathway>
Example 3: Pain Survey with Iteration
In this example, we are going to list the family members known to the system and then ask the
wearer which of those family members have oral pain. The wearer will then be asked a pain
question for each member.
<CarePathway>
<Interventions>
<Intervention>
<SystemID>http://www.carethings.com/sysId#d2c157</SystemID>
<Ontology>
http://www.carethings.com/ontology/mainInteraction
</Ontology>
<Interactions>
<Interaction>
<ID>http://www.carethings.com/userId#Q1</ID>
<Text>Which of the following members of your family are
having oral pain?
</Text>
<Ontology>
http://www.carethings.com/ontology/painValue5
</Ontology>
<SystemID>http://www.carethings.com/sysId#35ae3d</SystemID>
<Type>InterrogativeSingleSelect</Type>
<ResponseRequired>YES</ResponseRequired>
<PreBehavior>
http://www.carethings.com/behavior#SetupFamilyMembers
</PreBehavior>
<Responses>
<!--
The response list is generated by the preBehavior:
SetupFamilyMembers.
The following is an example of a generated response
-->
<Response>
<SystemID>
http://www.carethings.com/sysId#28
</SystemID>
<Type>Fixed</Type>
<Label>Janet Farmer</Label>
<WorldModel>
<Fact lang="clips">
(JanetFarmer relative oralPain)
</Fact>
</WorldModel>
</Response>
</Responses>
</Interaction>
<DecisionPoint>
<ID>http://www.carethings.com/userId#DP1</ID>
<Ontology>
http://www.carethings.com/ontology/painValue5
</Ontology>
<SystemID>http://www.carethings.com/sysId#35ae3d</SystemID>
<Type>OrSplit</Type>
<Transitions>
<Transition>
<SystemID>
http://www.carethings.com/sysId#2333238
</SystemID>
<To>http://www.carethings.com/sysId#f111c7</To>
<Active>
<Condition>
(?val relative oralPain) (test exists)
</Condition>
</Active>
</Transition>
<Transition>
<SystemID>
http://www.carethings.com/sysId#2333238
</SystemID>
<To>http://www.carethings.com/sysId#a112c7</To>
<Active>
<Condition>
(?val relative oralPain) (test not exists)
</Condition>
</Active>
</Transition>
</Transitions>
</DecisionPoint>
<Interaction>
<ID>http://www.carethings.com/userId#Q2</ID>
<Text>When {? relative oralPain} IS talking, eating,
or drinking, how intense (severe, strong) is
the pain in their mouth?
</Text>
<Ontology>
http://www.carethings.com/ontology/interaction
</Ontology>
<SystemID>http://www.carethings.com/sysId#f111c7</SystemID>
<Type>InterrogativeSingleSelect</Type>
<ResponseRequired>YES</ResponseRequired>
<Responses>
<Response>
<SystemID>
http://www.carethings.com/sysId#28
</SystemID>
<Type>Fixed</Type>
<Label>No pain</Label>
</Response>
<Response>
<SystemID>
http://www.carethings.com/sysId#28610
</SystemID>
<Type>Fixed</Type>
<Label>The most intense pain sensation
imaginable
</Label>
</Response>
</Responses>
</Interaction>
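<!--
DP2 provides the back edge of the loop: its single
unconditional transition returns to Q1 (sysId#35ae3d) so the
wearer can report the next family member in pain.
-->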
<DecisionPoint>
<ID>http://www.carethings.com/userId#DP2</ID>
<SystemID>http://www.carethings.com/sysId#35383d</SystemID>
<Type>AndSplit</Type>
<Transitions>
<Transition>
<SystemID>
http://www.carethings.com/sysId#2333238
</SystemID>
<To>http://www.carethings.com/sysId#35ae3d</To>
</Transition>
</Transitions>
</DecisionPoint>
<Interaction>
<ID>http://www.carethings.com/userId#Q3</ID>
<Text>When you ARE NOT talking, eating, or drinking, how
sharp (like a knife) is the pain in your mouth?
</Text>
<Ontology>
http://www.carethings.com/ontology/interaction
</Ontology>
<SystemID>http://www.carethings.com/sysId#a112c7</SystemID>
<Type>InterrogativeSingleSelect</Type>
<ResponseRequired>YES</ResponseRequired>
<Responses>
<Response>
<SystemID>
http://www.carethings.com/sysId#28
</SystemID>
<Type>Fixed</Type>
<Label>No pain</Label>
</Response>
<Response>
<SystemID>
http://www.carethings.com/sysId#28610
</SystemID>
<Type>Fixed</Type>
<Label>The most intense pain sensation
imaginable
</Label>
</Response>
</Responses>
</Interaction>
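<!--
The closing interaction: it presents no text to the wearer
and simply invokes the Final pathway behavior.
-->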
<Interaction>
<ID>http://www.carethings.com/userId#iFinalInteraction</ID>
<Text/>
<Type>Imperative</Type>
<PathwayBehaviors>
<Behavior>
http://www.carethings.com/sysBehavior/className#Final
</Behavior>
</PathwayBehaviors>
</Interaction>
</Interactions>
</Intervention>
</Interventions>
</CarePathway>

More Related Content

What's hot

Hospital management system business case
Hospital management system business caseHospital management system business case
Hospital management system business caseNeelam Priya
 
Hospital management
Hospital managementHospital management
Hospital managementVivek Gautam
 
Implementation and Use of ISO EN 13606 and openEHR
Implementation and Use of ISO EN 13606 and openEHRImplementation and Use of ISO EN 13606 and openEHR
Implementation and Use of ISO EN 13606 and openEHRKoray Atalag
 
1335829820.8071 integrated%20hospital%20management%20system%20%5bihms%5d
1335829820.8071 integrated%20hospital%20management%20system%20%5bihms%5d1335829820.8071 integrated%20hospital%20management%20system%20%5bihms%5d
1335829820.8071 integrated%20hospital%20management%20system%20%5bihms%5dashlinrockey
 
An Intelligent Electronic Patient Record Management System (IEPRMS)
An Intelligent Electronic Patient Record Management System (IEPRMS)An Intelligent Electronic Patient Record Management System (IEPRMS)
An Intelligent Electronic Patient Record Management System (IEPRMS)IJCSIS Research Publications
 
Hospital presentation
Hospital presentationHospital presentation
Hospital presentationRANJIT SINGH
 
hospital management system
hospital management systemhospital management system
hospital management systemAnmol Purohit
 

What's hot (9)

EHR White Paper
EHR White PaperEHR White Paper
EHR White Paper
 
Hospital management system business case
Hospital management system business caseHospital management system business case
Hospital management system business case
 
Hospital management
Hospital managementHospital management
Hospital management
 
Use case of hospital managment system
Use case of hospital managment systemUse case of hospital managment system
Use case of hospital managment system
 
Implementation and Use of ISO EN 13606 and openEHR
Implementation and Use of ISO EN 13606 and openEHRImplementation and Use of ISO EN 13606 and openEHR
Implementation and Use of ISO EN 13606 and openEHR
 
1335829820.8071 integrated%20hospital%20management%20system%20%5bihms%5d
1335829820.8071 integrated%20hospital%20management%20system%20%5bihms%5d1335829820.8071 integrated%20hospital%20management%20system%20%5bihms%5d
1335829820.8071 integrated%20hospital%20management%20system%20%5bihms%5d
 
An Intelligent Electronic Patient Record Management System (IEPRMS)
An Intelligent Electronic Patient Record Management System (IEPRMS)An Intelligent Electronic Patient Record Management System (IEPRMS)
An Intelligent Electronic Patient Record Management System (IEPRMS)
 
Hospital presentation
Hospital presentationHospital presentation
Hospital presentation
 
hospital management system
hospital management systemhospital management system
hospital management system
 

Viewers also liked

emPATH Open Sourced Mobile Framework
emPATH Open Sourced Mobile FrameworkemPATH Open Sourced Mobile Framework
emPATH Open Sourced Mobile FrameworkLarry Suarez
 
Predictive Arrival
Predictive ArrivalPredictive Arrival
Predictive ArrivalLarry Suarez
 
Agent-Based Workflow
Agent-Based WorkflowAgent-Based Workflow
Agent-Based WorkflowLarry Suarez
 
The Agent Net AGV Forklift Simulation
The Agent Net AGV Forklift SimulationThe Agent Net AGV Forklift Simulation
The Agent Net AGV Forklift SimulationLarry Suarez
 
Como fazer Fichamento de Texto ou Livro
Como fazer Fichamento de Texto ou LivroComo fazer Fichamento de Texto ou Livro
Como fazer Fichamento de Texto ou LivroINSTITUTO GENS
 

Viewers also liked (7)

emPATH Open Sourced Mobile Framework
emPATH Open Sourced Mobile FrameworkemPATH Open Sourced Mobile Framework
emPATH Open Sourced Mobile Framework
 
Predictive Arrival
Predictive ArrivalPredictive Arrival
Predictive Arrival
 
Agent-Based Workflow
Agent-Based WorkflowAgent-Based Workflow
Agent-Based Workflow
 
MedRPM
MedRPMMedRPM
MedRPM
 
The Agent Net AGV Forklift Simulation
The Agent Net AGV Forklift SimulationThe Agent Net AGV Forklift Simulation
The Agent Net AGV Forklift Simulation
 
The C2 Agent Grid
The C2 Agent GridThe C2 Agent Grid
The C2 Agent Grid
 
Como fazer Fichamento de Texto ou Livro
Como fazer Fichamento de Texto ou LivroComo fazer Fichamento de Texto ou Livro
Como fazer Fichamento de Texto ou Livro
 

Similar to emPATH Developer's Guide

INTERNATIONAL JOURNAL OF HEALTH GEOGRAPHICSKamel Boulos .docx
INTERNATIONAL JOURNAL OF HEALTH GEOGRAPHICSKamel Boulos .docxINTERNATIONAL JOURNAL OF HEALTH GEOGRAPHICSKamel Boulos .docx
INTERNATIONAL JOURNAL OF HEALTH GEOGRAPHICSKamel Boulos .docxnormanibarber20063
 
Telehealthcare for older people: barriers to large-scale roll-outs
Telehealthcare for older people: barriers to large-scale roll-outsTelehealthcare for older people: barriers to large-scale roll-outs
Telehealthcare for older people: barriers to large-scale roll-outsMaged N. Kamel Boulos
 
Mri final report
Mri final reportMri final report
Mri final reportAalap Doshi
 
PATIENT MANAGEMENT SYSTEM project
PATIENT MANAGEMENT SYSTEM projectPATIENT MANAGEMENT SYSTEM project
PATIENT MANAGEMENT SYSTEM projectLaud Randy Amofah
 
2010-sep-16 Services for RIMBAA based on EHR-S FM
2010-sep-16 Services for RIMBAA based on EHR-S FM2010-sep-16 Services for RIMBAA based on EHR-S FM
2010-sep-16 Services for RIMBAA based on EHR-S FMMichael van der Zel
 
IRJET - Blockchain for Medical Data Access and Permission Management
IRJET - Blockchain for Medical Data Access and Permission ManagementIRJET - Blockchain for Medical Data Access and Permission Management
IRJET - Blockchain for Medical Data Access and Permission ManagementIRJET Journal
 
Electronic Medical Regulation
Electronic Medical RegulationElectronic Medical Regulation
Electronic Medical RegulationAditya Chauhan
 
Hospital Managment System Project Proposal
Hospital Managment System Project ProposalHospital Managment System Project Proposal
Hospital Managment System Project ProposalAzeemaj101
 
IRJET - Heart Health Classification and Prediction using Machine Learning
IRJET -  	  Heart Health Classification and Prediction using Machine LearningIRJET -  	  Heart Health Classification and Prediction using Machine Learning
IRJET - Heart Health Classification and Prediction using Machine LearningIRJET Journal
 
Simulated Patient Care Pathways
Simulated Patient Care PathwaysSimulated Patient Care Pathways
Simulated Patient Care PathwaysKevin Russell
 
Ijarcet vol-2-issue-4-1393-1397
Ijarcet vol-2-issue-4-1393-1397Ijarcet vol-2-issue-4-1393-1397
Ijarcet vol-2-issue-4-1393-1397Editor IJARCET
 
The need for interoperability in blockchain-based initiatives to facilitate c...
The need for interoperability in blockchain-based initiatives to facilitate c...The need for interoperability in blockchain-based initiatives to facilitate c...
The need for interoperability in blockchain-based initiatives to facilitate c...Massimiliano Masi
 
Recovery Prediction in the Framework of Cloud-Based Rehabilitation Exergame
Recovery Prediction in the Framework of Cloud-Based Rehabilitation ExergameRecovery Prediction in the Framework of Cloud-Based Rehabilitation Exergame
Recovery Prediction in the Framework of Cloud-Based Rehabilitation Exergametoukaigi
 
2020 book challenges_andtrendsinmultimoda
2020 book challenges_andtrendsinmultimoda2020 book challenges_andtrendsinmultimoda
2020 book challenges_andtrendsinmultimodassuserbf2656
 
Hospital Management System Documentation Java
Hospital Management System Documentation Java Hospital Management System Documentation Java
Hospital Management System Documentation Java Azeemaj101
 
DESIGN OF A SAFE AND SMART MEDICINE BOX
DESIGN OF A SAFE AND SMART MEDICINE BOX DESIGN OF A SAFE AND SMART MEDICINE BOX
DESIGN OF A SAFE AND SMART MEDICINE BOX ijbesjournal
 

Similar to emPATH Developer's Guide (20)

Smart-X: an Adaptive Multi-Agent Platform for Smart-Topics
Smart-X: an Adaptive Multi-Agent Platform for Smart-TopicsSmart-X: an Adaptive Multi-Agent Platform for Smart-Topics
Smart-X: an Adaptive Multi-Agent Platform for Smart-Topics
 
INTERNATIONAL JOURNAL OF HEALTH GEOGRAPHICSKamel Boulos .docx
INTERNATIONAL JOURNAL OF HEALTH GEOGRAPHICSKamel Boulos .docxINTERNATIONAL JOURNAL OF HEALTH GEOGRAPHICSKamel Boulos .docx
INTERNATIONAL JOURNAL OF HEALTH GEOGRAPHICSKamel Boulos .docx
 
Telehealthcare for older people: barriers to large-scale roll-outs
Telehealthcare for older people: barriers to large-scale roll-outsTelehealthcare for older people: barriers to large-scale roll-outs
Telehealthcare for older people: barriers to large-scale roll-outs
 
Mri final report
Mri final reportMri final report
Mri final report
 
PATIENT MANAGEMENT SYSTEM project
PATIENT MANAGEMENT SYSTEM projectPATIENT MANAGEMENT SYSTEM project
PATIENT MANAGEMENT SYSTEM project
 
2010-sep-16 Services for RIMBAA based on EHR-S FM
2010-sep-16 Services for RIMBAA based on EHR-S FM2010-sep-16 Services for RIMBAA based on EHR-S FM
2010-sep-16 Services for RIMBAA based on EHR-S FM
 
IRJET - Blockchain for Medical Data Access and Permission Management
IRJET - Blockchain for Medical Data Access and Permission ManagementIRJET - Blockchain for Medical Data Access and Permission Management
IRJET - Blockchain for Medical Data Access and Permission Management
 
Electronic Medical Regulation
Electronic Medical RegulationElectronic Medical Regulation
Electronic Medical Regulation
 
Hospital Managment System Project Proposal
Hospital Managment System Project ProposalHospital Managment System Project Proposal
Hospital Managment System Project Proposal
 
IRJET - Heart Health Classification and Prediction using Machine Learning
IRJET -  	  Heart Health Classification and Prediction using Machine LearningIRJET -  	  Heart Health Classification and Prediction using Machine Learning
IRJET - Heart Health Classification and Prediction using Machine Learning
 
Simulated Patient Care Pathways
Simulated Patient Care PathwaysSimulated Patient Care Pathways
Simulated Patient Care Pathways
 
Ijarcet vol-2-issue-4-1393-1397
Ijarcet vol-2-issue-4-1393-1397Ijarcet vol-2-issue-4-1393-1397
Ijarcet vol-2-issue-4-1393-1397
 
[IJET-V2I1P10] Authors:Prof.Dr.Pramod Patil, Sneha Chhanchure, Asmita Dhage, ...
[IJET-V2I1P10] Authors:Prof.Dr.Pramod Patil, Sneha Chhanchure, Asmita Dhage, ...[IJET-V2I1P10] Authors:Prof.Dr.Pramod Patil, Sneha Chhanchure, Asmita Dhage, ...
[IJET-V2I1P10] Authors:Prof.Dr.Pramod Patil, Sneha Chhanchure, Asmita Dhage, ...
 
The need for interoperability in blockchain-based initiatives to facilitate c...
The need for interoperability in blockchain-based initiatives to facilitate c...The need for interoperability in blockchain-based initiatives to facilitate c...
The need for interoperability in blockchain-based initiatives to facilitate c...
 
Recovery Prediction in the Framework of Cloud-Based Rehabilitation Exergame
Recovery Prediction in the Framework of Cloud-Based Rehabilitation ExergameRecovery Prediction in the Framework of Cloud-Based Rehabilitation Exergame
Recovery Prediction in the Framework of Cloud-Based Rehabilitation Exergame
 
Protable ultrasound
Protable ultrasoundProtable ultrasound
Protable ultrasound
 
Protable ultrasound
Protable ultrasoundProtable ultrasound
Protable ultrasound
 
2020 book challenges_andtrendsinmultimoda
2020 book challenges_andtrendsinmultimoda2020 book challenges_andtrendsinmultimoda
2020 book challenges_andtrendsinmultimoda
 
Hospital Management System Documentation Java
Hospital Management System Documentation Java Hospital Management System Documentation Java
Hospital Management System Documentation Java
 
DESIGN OF A SAFE AND SMART MEDICINE BOX
DESIGN OF A SAFE AND SMART MEDICINE BOX DESIGN OF A SAFE AND SMART MEDICINE BOX
DESIGN OF A SAFE AND SMART MEDICINE BOX
 

Recently uploaded

Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxLoriGlavin3
 
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024BookNet Canada
 
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024BookNet Canada
 
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...AliaaTarek5
 
TrustArc Webinar - How to Build Consumer Trust Through Data Privacy
TrustArc Webinar - How to Build Consumer Trust Through Data PrivacyTrustArc Webinar - How to Build Consumer Trust Through Data Privacy
TrustArc Webinar - How to Build Consumer Trust Through Data PrivacyTrustArc
 
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptxPasskey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptxLoriGlavin3
 
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICESSALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICESmohitsingh558521
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsPixlogix Infotech
 
How to write a Business Continuity Plan
How to write a Business Continuity PlanHow to write a Business Continuity Plan
How to write a Business Continuity PlanDatabarracks
 
What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024Stephanie Beckett
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii SoldatenkoFwdays
 
SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024Lorenzo Miniero
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 3652toLead Limited
 
What is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfWhat is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfMounikaPolabathina
 
Time Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directionsTime Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directionsNathaniel Shimoni
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebUiPathCommunity
 
The State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxThe State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxLoriGlavin3
 
Take control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteTake control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteDianaGray10
 
Scale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterScale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterMydbops
 
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Mark Simos
 

Recently uploaded (20)

Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
 
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
(How to Program) Paul Deitel, Harvey Deitel-Java How to Program, Early Object...
 
TrustArc Webinar - How to Build Consumer Trust Through Data Privacy
TrustArc Webinar - How to Build Consumer Trust Through Data PrivacyTrustArc Webinar - How to Build Consumer Trust Through Data Privacy
TrustArc Webinar - How to Build Consumer Trust Through Data Privacy
 
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptxPasskey Providers and Enabling Portability: FIDO Paris Seminar.pptx
Passkey Providers and Enabling Portability: FIDO Paris Seminar.pptx
 
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICESSALESFORCE EDUCATION CLOUD | FEXLE SERVICES
SALESFORCE EDUCATION CLOUD | FEXLE SERVICES
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and Cons
 
How to write a Business Continuity Plan
How to write a Business Continuity PlanHow to write a Business Continuity Plan
How to write a Business Continuity Plan
 
What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024What's New in Teams Calling, Meetings and Devices March 2024
What's New in Teams Calling, Meetings and Devices March 2024
 
"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko"Debugging python applications inside k8s environment", Andrii Soldatenko
"Debugging python applications inside k8s environment", Andrii Soldatenko
 
SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024
 
Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365Ensuring Technical Readiness For Copilot in Microsoft 365
Ensuring Technical Readiness For Copilot in Microsoft 365
 
What is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfWhat is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdf
 
Time Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directionsTime Series Foundation Models - current state and future directions
Time Series Foundation Models - current state and future directions
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio Web
 
The State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxThe State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptx
 
Take control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test SuiteTake control of your SAP testing with UiPath Test Suite
Take control of your SAP testing with UiPath Test Suite
 
Scale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterScale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL Router
 
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
 

emPATH Developer's Guide

  • 1. (1) Wearer: artifact holding or wearing the mobile device. An artifact may be an inanimate or animate object. ISU Mobile Services Group UCSF School of Medicine emPATH Open Framework Care Pathway XML Developer's Guide Larry Suarez Version 1.5
  • 2. emPATH Care Pathways 2 UCSF 10/18/2011 Table&of&Contents& The&emPATH&Framework .....................................................................................................4! General&Pathway&Structure ..................................................................................................7! Interventions...................................................................................................................................8! Interactions .....................................................................................................................................9! Transitions.....................................................................................................................................10! The&Semantic&Web&and&emPATH..................................................................................... 14! Resource!Identification .................................................................................................................14! emPATH!Data!Model.....................................................................................................................15! Care&Pathway&Representation.......................................................................................... 16! Care!Pathway ................................................................................................................................16! Interventions.................................................................................................................................18! Interactions ...................................................................................................................................20! Responses......................................................................................................................................22! Transitions.....................................................................................................................................23! Behaviors.......................................................................................................................................26! Example!1:!Pain!Survey!(2!Questions)...........................................................................................27! Example!2:!Pain!Survey!with!Conditional!Transitions!(3!Questions).............................................28! Example!3:!Pain!Survey!with!Iteration ..........................................................................................31!
  • 3. emPATH Care Pathways 3 UCSF 10/18/2011 Revision History Date Rev Author Result 10/16/11 1.0 Larry Suarez Initial document. 10/18/11 1.5 Larry Suarez Numerous edits
  • 4. emPATH Care Pathways 4 UCSF 10/18/2011 The emPATH Framework The emPATH Framework provides a platform for researchers and software developers to deliver mobile-based medical solutions in support of evidence-based medicine (EBM). The primary means of supporting EBM is through care pathways. Figure 1 shows an example care pathway that manages a patient wearing a heart rate sensor. The components in the diagram consist of interventions (blue spheres with tan outer ring) which represent the pathway itself, interactions (blue spheres) which represent work to do, and decision points (blue triangles) which represent a point in which a decision is to be made concerning which of multiple paths to take in the pathway. Arrows between components are known as transitions. Transitions that are drawn into a decision point are said to "merge" while transitions drawn from a decision point are said to "split". The sample pathway in Figure 1 shows a number of advance features within the emPATH framework: • Parallel branch execution - execute multiple branches in the pathway at the same time ("Extract Vitals" branch and "Notify Care Provider" branch). • Autonomous task/step execution - execute a task in the pathway without user intervention ("Extract Vitals", "Monitor Vitals"). • Multi-system interaction - interact with external systems from within the pathway ("Notify Care Provider", "Update EMR"). • Block invocation - invoke pre-defined blocks of tasks/steps from within the pathway ("Schedule Next Intervention" intervention). • Complex Decision Support - support for AND, OR, and XOR decision branching. • External sensor support - access sensor data directly from the pathway ("Extract Vitals"). Figure 1: Sample Care Pathway Care pathways also support cross-disciplinary work. Medical researchers and computer scientists use care pathways to communicate their solutions. Care pathways are based on
  • 5. emPATH Care Pathways 5 UCSF 10/18/2011 technology that is understood by numerous disciplines. There is also a tremendous amount of research on the manipulation of care pathways to support event-driven solutions. Care pathways also represent easily identifiable content that can be validated outside the bounds of the emPATH framework such as when the FDA requires validation of medical applications. emPATH is designed to execute multiple pathways in parallel in reaction to events occurring around the wearer 1 . In effect, any external event could cause the mobile device to react via a care pathway as shown in Figure 2. Figure 2: Event-Driven emPATH Framework Care pathways are a well-known construct in medical research. Many national pathways are published by organizations for care provider consumption. What is typically not well known is the means in which to express pathways. Pathways are published in many forms including descriptive text and/or diagrams. Pathways expressed in this paper will consist of two forms: diagrams borrowing concepts from efforts in the computing industry in the area of workflow and business process management; and in XML which is a document representation which can be used to represent directed acyclic graphs. The diagrams are also loosely based on the mathematical modeling language called Petri Nets. The Framework and supporting care pathways reside entirely on the mobile device. Care pathways are represented internally in any number of formats in order to support multiple vendor solutions. emPATH supports its own representation of care pathways using the XML document protocol in order to support advance features dealing with dynamic care pathways (or self-* systems 2 ). External rendering systems that interact with the emPATH framework only see one consistent interface regardless of the internal representation. emPATH is comprised of two frameworks: a general framework, the Core Framework, which contains features that are necessary to support mobile medical applications and a second framework, the EBM Framework, which directly supports dynamic care pathways. A major feature of the emPATH framework is the internal blackboard system. The blackboard system 1 Artifact holding or wearing the mobile device. An artifact may be an inanimate or animate object. 2 Pathways that can self-heal (prune), self-organize (change pathway ordering), self-generate (add new pathways), and self-optimize (prune redundant pathway steps)
  • 6. emPATH Care Pathways 6 UCSF 10/18/2011 supports connectivity between the services of the Core Framework and the services of the EBM Framework. All services in emPATH have access to the blackboard system and can view real- time changes to the blackboard. The blackboard system acts as a "chalk board" where services can write items of interest that can trigger other services within the mobile application. For example, a service monitoring patient temperature could write to the blackboard that the patient's temperature has exceeded a threshold. This could trigger care pathways or other services in the application. Figure 3 shows an example in which the emPATH blackboard drives the behavior of a mobile application. A patient-worn pH sensor has posted to the blackboard a pH value of 4.3. emPATH activates (executes) all pathways that respond to the sensor pH value on the blackboard. Care pathways can have defined activation criteria based on an ontology. Pathways may also effect their own processing and even start the processing of other related pathways merely by populating the blackboard. Figure 4 shows a pathway that affects its own processing and indirectly starts another pathway. Figure 4: Self-Affecting Care Pathway Figure 3: Sensor Activating a Care Pathway
  • 7. emPATH Care Pathways 7 UCSF 10/18/2011 General Pathway Structure A mobile application embedding the emPATH Framework can support any number of care pathways. The mobile application is essentially a collection of care pathways in which the pathways execute in a myriad of scenarios. For example, in an event-driven solution a pathway may execute as a result of the wearer's heart rate exceeding a threshold. In swarm-based solutions 3 , a pathway may execute if the wearer enters an area in which a mobile swarm is manifesting in reaction to a catastrophic event such as a major earthquake. The designer of the care pathway is not necessarily aware of how the pathway will execute or under what situation. emPATH supports a number of explicit and collective behaviors to support very complex scenarios such as: • If the wearer has a heart rate that exceeds the defined threshold, execute the related pathways and notify relevant care providers of the event. Locate a nearby medical assistant if available and inform their device of the situation and of patient vitals. • If the wearer is entering an area of concern, warn the wearer with an audible "ping" and execute the related pathways to guide the wearer for immediate evacuation from the area. • Process messages sent from the patient's Electronic Medical Record System to the wearer's device that indicate a new medication regime has been prescribed by a care provider. Upload the associated care pathways and place in the internal "pool" of existing pathways ready for execution when necessary. • One of the wearer mobile applications is requesting information about the patient's current state and well-being. The corresponding care pathways start execution and process the responses from the patient and sensors by updating the internal personal health record. That update causes the execution of associated care pathways to respond to any potential patient health episode. Pathway designers can indicate the goals of a pathway such as lowering cholesterol, weight reduction, panic attack resolution, etc. These goals can be used to construct mobile solutions to help manage patients under treatment. A care pathway consists of five major components or resources: 1. Interventions - an intervention represents a goal defined by the care provider for a patient. 2. Interactions - individual tasks or steps that are used to achieve a goal defined by an intervention. 3. Responses - system accepted results from executing an interaction. 4. Decision Points - special type of interactions that represent junctures within a care plan that indicate a decision is required to determine the correct path in the pathway. 5. Behaviors - processing code that can be referenced within interventions, interactions, and responses. Behaviors are typically used to do autonomous work for an interaction. 3 Collection of distributed autonomous artifacts displaying collective behavior.
  • 8. emPATH Care Pathways 8 UCSF 10/18/2011 Interventions An intervention can be viewed as a container of interactions whose collective task is to achieve a goal. The interactions within the encompassing intervention are said to be bounded by the intervention. This is an important concept when describing the world model of a care pathway. The world model is a description of the world, as it is known to the care pathway. The world model can help drive the goals of a care pathway. Two identical care pathways can execute differently based on the state of the world model. The bounded interactions inherit the world model of the intervention. Interventions may also contain embedded interventions. Figure 5 shows an example collection of interventions stored on the mobile device. Notice that the interventions are highly focused and specific to the wearer. The set of interventions within second-generation 4 mobile solutions will change over time to reflect the changes to the patient's status. Supporting collections of interventions provides the dynamic nature of second-generation medical solutions. The collection can change in real-time as the patient's health changes. The collection can change in reaction to patient medical encounters such as the care provider requesting the patient to lose weight. Intervention collections provide a new dimension for developers constructing medical mobile applications. Figure 5: Focused Mobile-Based Interventions Figure 6 shows how interventions and interactions are related. The entire pathway is represented by the intervention and its encompassing interactions. 4 Mobile soluitons which will be more attuned to the device, the wearer, the wearer's afflictions, and to the wearer's surrounding environment.
  • 9. emPATH Care Pathways 9 UCSF 10/18/2011 Figure 6: Care Pathway: Interventions and Interactions Interactions Interactions represent the core resource within pathways. Interactions do the actual work. Work includes communicating with external systems, interacting with internal and external sensors, requesting data from the wearer, and executing algorithms. Figure 7 shows the major parts of an interaction. The parts consist of: • PreBehavior - represents local on-board software that executes prior to the execution of the interaction. Valuable for providing any initialization or setup for the interaction. The preBehavior software has the ability to change parts of the interaction prior to execution. • Pathway Behavior. In traditional process or workflow systems, the behavior is the "worker", "performer", or "implementation" of the interaction. The behavior represents local on-board software that executes the interaction. A behavior is only present if the interaction represents an autonomous step. If the interaction supports a Human-in-loop (HIL), then the rendering engine will use other information in the interaction to communicate with the wearer. • PostBehavior - represents local on-board software that executes after completion of the interaction. PostBehaviors are valuable for providing any post-processing for the interaction such as persisting data, manipulating response data, or cleaning up after the interaction. • Outgoing Transitions - the outgoing transitions represent all transitions that originate from the interaction and destined to other interactions or decision points. Figure 7: Interaction Structure
  • 10. emPATH Care Pathways 10 UCSF 10/18/2011 For interactions that support HIL, the interaction may contain a collection of responses or what is known as an "answer set". Each response represents a possible "answer" for the interaction. For example, if the interaction needs to ask the wearer if they have oral pain while eating, one possible answer or response is "intense". The rendering engine using the emPATH framework is open to use the answer set in any way it feels beneficial. The rendering engine can ignore the answer set, augment the answer set through a preBehavior, or follow the answer set verbatim. Figure 8 shows the relationship between an interaction and it's corresponding answer set. Figure 8: Interactions and Answer Sets Transitions Most care pathways are not simply a sequential set of interactions. The transition from one interaction to another may be based on the current state of the wearer and the status of the wearer's health. A pathway designer must be able to indicate in the pathway where a decision is required to determine the next step of an intervention. emPATH provides three constructs for changing the path of a care pathway: • "Skip To" instructions • Interaction transitions • Decision Point resources Skip-to instructions are inserted within response resources and reference specific interactions to "skip to". If the wearer selects a response and the response contains a skip-to instruction, the engine will select the next interaction from the instruction. The skip-to instruction may not contain a condition. The skip-to instruction is always followed if the corresponding response is selected. Figure 9 shows the use of the skip-to construct.
  • 11. emPATH Care Pathways 11 UCSF 10/18/2011 Figure 9: Skip-To Instructions The second transition construct is interaction transitions. These are transitions specified within the interaction as oppose to within the response. Interaction transitions are typically used when the interaction is autonomous (no human-in-loop). Figure 10 shows the use of transitions within an interaction. This type of transition does support conditions that can reference the blackboard. Any condition that is satisfied will result in a transition. Hence, more than one transition can occur. Figure 10: Interaction Transitions Decision points are the third type of transition construct. Decision points represent specific points within a pathway where either multiple paths exist in the care pathway or multiple paths merge in the care pathway. Decision points that are used to support multiple paths in the pathway are known as "splits". Decision points that are used to support merging paths are called "joins". The type of decision point determines how many paths are followed or how many paths are merged. Figures 11 and 12 display the types of splits supported by emPATH. Splits are handled as follows: • AND Split: All transitions are followed. There are no conditions related to each transition.
  • 12. emPATH Care Pathways 12 UCSF 10/18/2011 • OR Split: Any transition whose condition is satisfied is activated. • XOR Split: The first transition whose condition is satisfied is activated. All others are ignored. Figure 13 displays the types of merges supported by emPATH. Merges are handled as follows: • AND Join: The decision point is not active until all transitions going into the decision point are active. This decision point allows the emPATH engine to synchronize all inbound transitions to that point in the pathway. In essence, the engine will wait until all transitions inbound to the decision point have completed. • OR Join: The decision point is active when any of the transitions inbound to the decision point is active. This useful when there are no dependencies among the various transitions inbound to the decision point. Figure 11: AND,OR Decision Point Splits Figure 12: XOR Decision Point Split
  • 13. emPATH Care Pathways 13 UCSF 10/18/2011 Figure 13: Decision Points Joins Decision points may also be used to construct iteration patterns within a care pathway. Iteration patterns are useful for constructing repetitive sequences of interactions such as the processing of multiple sensor data from a body sensor network. Two decision points mark the beginning and end of the iteration. One decision point manages the condition that determines if an interaction is complete. The other decision point manages the iteration loop. Figure 14 shows an iteration supporting a classic "for loop" construct found in numerous programming languages. The first decision point manages the condition and the second decision point manages the loop. The first decision point also provides the transition given that the iteration completes. Figure 14: Decision Points for Repetitive Loops. Pre-Loop Decision Figure 15 shows an iteration supporting a classic "while loop" construct found in numerous programming languages. The first decision point manages the loop and the second decision point manages the condition. The second decision point also provides the transition given that the iteration completes. Figure 15: Decision Points for Repetitive Loops. Post-Loop Decision
  • 14. emPATH Care Pathways 14 UCSF 10/18/2011 The Semantic Web and emPATH The healthcare industry and life science research is moving towards the support of open data. Open data is medical data that can be readily shared among institutions for the benefit of patient research and care. Shared data includes care pathways, clinical data results, clinical observations, and real-time medical information. The World Wide Web Consortium (W3C) is defining a data model and tools for data sharing. That data model and related tools is called the Semantic Web. emPATH fully complies with the protocols defined by W3C for the Semantic Web. Data stored on-board the mobile device is in compliance with Semantic Web standards and can be readily referenced from within care pathways. This is a powerful approach because emPATH applications can then receive and process data directly from other applications that follow the Semantic Web protocols. This will be very important since second generation solutions will support device-to-device communication. In addition, data generated by the mobile device is externalized in support of the Semantic Web. The Semantic Web defines the format of the data but not the content. Content values are defined by ontologies. emPATH can support any number of ontologies defined by leading institutions. For example, second generation mobile applications generated by Kaiser Permanente can be designed to generate mobile data so that the data can be readily shared and understood within Kaiser and externalized to other institutions when beneficial. Resource Identification emPATH uses W3C Uniform Resource Identifiers (URIs) for resource identification. Resources can be defined remotely or locally on-board the mobile device. Care pathways can share resource definitions. For example, the following URIs reference a behavior with the name "AnalyzeData" which can be referenced within multiple care pathways. Notice that each URI indicates the construct (programming language) used to create the behavior and hence how to execute the resource: http://www.carethings.com/objc/behaviorId#AnalyzeData http://www.carethings.com/php/behaviorId#AnalyzeData http://www.carethings.com/java/behaviorId#AnalyzeData emPATH follows the "linked data" approach to representing care pathway data as a way of sharing the care pathways among interested parties. Many resources referenced in a care pathway use URIs as an identifier. Resources in a pathway include ontologies, system identifiers, behaviors, text resources, and audio resources. Interventions, interactions, decision points, and responses are each assigned a system identifier by the emPATH framework. The system identifier is defined to be web-wide unique. emPATH uses the URI path "www.carethings.com/sysId" to indicate that the URI represents a resource identifier. The fragment identifier of the URI is the actual system identifier. Pathway designers may also define a resource identifier but should be very careful when depending on the identifier for constructs such as transitions. The following is an example XML in emPATH of an interaction with both a system identifier and a designer's identifier: <Interaction> <SystemID>http://www.carethings.com/sysId#60bd</SystemID> <ID>http://www.carethings.com/userId#myID</ID> </Interaction>
  • 15. emPATH Care Pathways 15 UCSF 10/18/2011 emPATH Data Model emPATH uses the wearer's health information to drive a number of pathway features including constraints, world model representation, and clinical data. For example, a pathway designer can indicate that an intervention is not applicable unless the wearer has diabetes. The designer must represent that information within emPATH using a data model. emPATH supports the Resource Description Framework (RDF) Data Model which uses RDF triples to represent information. This implies that the designer must ultimately represent the constraint of "having diabetes" using RDF triples. Lets continue the example that the wearer has diabetes. The RDF triple may look as follows: (JohnBerner hasAffliction diabetes) emPATH stores thousands of RDF triples to represent all types of information about the wearer. RDF triples can be used within the care pathway where data is referenced. The following care pathway XML snippet represents the condition that the care pathway interaction is only applicable if the wearer has type 2 diabetes. The pathway uses the <Active> XML element to list triple patterns that must be satisfied in order for the interaction to be applicable: <Interaction> <Active> <Condition>(JohnBerner hasAffliction diabetes)</Condition> <Condition>(JohnBerner diabetesType type2)</Condition> </Active> </Interaction> Using RDF triples allows pathway designers to access data from the on-board patient health record system and data derived from external sources such as sensors. In addition, the pathway can reference data outside the bounds of the mobile device by using complete URIs within the RDF triples as follows: <Interaction> <Active> <Condition> (http://www.aclinic.org/patientName#JohnBerner http://www.aclinic.org/props/hasAffliction http://www.aclinic.org/diseases/diabetes) </Condition> <Condition>(John diabetesType type2)</Condition> </Active> </Interaction>
  • 16. emPATH Care Pathways 16 UCSF 10/18/2011 Care Pathway Representation Care pathways can be represented internally within emPATH in any number of protocols. emPATH will read the pathway from disk (either from a remote server or from the mobile device), interpret it, and then make the pathway available to the other subsystems of emPATH and the rendering engine. emPATH subsystems and the rendering engine(s) never see the internal representation of the pathway. emPATH provides a well-defined object-based interface for external software systems to use to process the pathway. Figure 16 shows the general flow for processing Human-In-Loop (HIL) interactions. emPATH provides a well-defined XML representation of care pathways. Developers are free to extend the emPATH framework to process other protocols. Figure 16: Supporting Multiple Care Pathway Formats The following sections will describe how emPATH pathways are represented using the XML protocol. Care Pathway A care pathway is described by the root XML element <CarePathway>. Within the root element are the definitions of the interventions, interactions, decision points, and responses. Elements within the care pathway may specify information about the author of the pathway, any related studies if this pathway contributes to a clinical trial, and information for the emPATH engine. A sample XML for a care pathway is as follows: <CarePathway> <PathwayXMLAuthor>Larry Suarez</PathwayXMLAuthor> <PathwayXMLOntology>CocaineMonitoring</PathwayXMLOntology> <Priority></Priority>
  • 17. emPATH Care Pathways 17 UCSF 10/18/2011 <Concurrency></Concurrency> <Resources> <Resource> <Type>Camera</Type> <Identifier>C-1</Identifier> <Duration></Duration> </Resource> <Resource> <Type>Sensor</Type> <Identifier>ACC-1</Identifier> <Duration></Duration> </Resource> </Resources> <Study> <PrimaryResearcher> <NameAddress> <FirstName>Mary</FirstName> <LastName>Menz</LastName> </NameAddress> <URL>http://www.ucsf.edu</URL> </PrimaryResearcher> <Participants> <Participant> <PartyType>Patient</PartyType> <PartyIdentifier> <Type>HAP-ID</Type> <Identifier>11111111</Identifier> </PartyIdentifier> <NameAddress> <FirstName>Sylvia</FirstName> <LastName>Sanders</LastName> </NameAddress> <URL>http://www.ucsf.edu</URL> <DateOfBirth>9/26/60</DateOfBirth> <Gender>F</Gender> <Sample>Gen-Female</Sample> </Participant> </Participants> <DataCollectors> <DataCollector> <PartyType>Nurse</PartyType> <Organization>String</Organization> <NameAddress> <FirstName>Jennifer</FirstName> <LastName>Larson</LastName> </NameAddress> <URL>http://www.ucsf.edu</URL> </DataCollector> </DataCollectors> </Study> <Interventions></Interventions> </CarePathway>
  • 18. emPATH Care Pathways 18 UCSF 10/18/2011 Interventions An intervention typically represents a goal as defined by a care provider. For example, there may be an intervention to represent a patient losing fifty pounds. An intervention consists of one or more interactions. Interventions are useful for grouping interactions under a common ontology. Interventions provide information that can be shared and accessible by all encompassing interactions. Interventions may also contain embedded interventions. This is useful to abstract out collections of interactions. The following XML example shows the general structure of an intervention consisting of interactions and embedded interventions. Interactions are always executed in order of appearance in the XML unless changed by Decision Point constructs (discussed later): <Interventions> <Intervention> <Interactions> <Interaction></Interaction> <Interaction></Interaction> <Interaction></Interaction> <Intervention></Intervention> //Embedded </Interactions> </Intervention> </Interventions> The rendering engine can invoke an intervention explicitly. This is typically the case when there is a Human-in-Loop and the wearer is requesting the intervention. For example, if the intervention represents a survey. If the intervention is autonomous, the intervention will have an activation condition that indicates to the emPATH engine when the intervention should start execution. For example, suppose that the care provider wants their care pathway to execute when the patient's esophageal pH level rises about 6.0. The XML would look as follows: <Interventions> <Intervention> <Active> <Condition lang="clips"> (sensor pH ?val)(test (> ?val 6)) </Condition> </Active> </Intervention> </Interventions> An intervention may also contribute to the wearer’s world model as part of the execution process. This is useful for setting global information that encompassing interactions need during their processing. For example, suppose that the intervention contains a number of interactions that monitor patient status and alerts the care provider of any issues. The intervention would like to set up the various thresholds used by the alert detection interactions. The XML would look as follows: <Interventions> <Intervention> <WorldModel> <Fact lang="clips">(pH alert 6)</Fact> <Fact lang="clips">(heartRate alert 170)</Fact> <Fact lang="clips">(weight alert 250)</Fact> </WorldModel> </Intervention> </Interventions>
  • 19. emPATH Care Pathways 19 UCSF 10/18/2011 The care pathway designer can request that a behavior execute before and/or after the execution of an intervention. Behaviors that execute before an intervention can be used to set up environments, initialize constructs, send notifications to providers, and general setup. Executing behaviors after an intervention can be used to clean up environments, send completion notifications, and general cleanup. An example XML look as follows: <Interventions> <Intervention> <PreBehavior> http://www.carethings.com/behavior#SetupAlerts </PreBehavior> <PostBehavior> http://www.carethings.com/behavior#FlushAlerts </PostBehavior> </Intervention> </Interventions> A major goal for care pathways executing on mobile devices are real-time interventions. To support real-time interventions, emPATH supports the specification of real-time constraints for interventions and interactions. For example, suppose we wish that the intervention must complete within two hours due to health response requirements. An example XML would look as follows: <Interventions> <Intervention> <RealTimeConstraints> <MaxDwell>120</MaxDwell> </RealTimeConstraints> </Intervention> </Interventions> Finally, certain care pathways may require execution at defined times during a care regime. For example, the care provider may wish the care pathway to execute every other day at 10:00 AM for daily exercise. Or the provider may wish the care pathway to execute at 8:00 PM every day so the patient can enter daily diary data. The following XML is an example where the time constraint is applied only on the 3rd week and 4th day of the clinical study. <Interventions> <Intervention> <RealTimeConstraints> <Schedule> <Week>3</Week> <Day>4</Day> <AtMost>OnceADay</AtMost> <TimeRange> <DateTimeFrom>11</DateTimeFrom> <DateTimeTo>19</DateTimeTo> </TimeRange> </Schedule> <RealTimeConstraints> </Intervention> </Interventions>
  • 20. emPATH Care Pathways 20 UCSF 10/18/2011 Interactions Interactions represent the individual steps or tasks required to achieve a goal. Each interaction has a "type" which indicates the intended meaning of the interaction. An interaction does not necessarily imply a conversation with the wearer. Interactions may occur autonomously through the application of behaviors. In addition, an interaction may be ignored completely because of the wearer’s status. For example, an interaction to "turn on oxygen pump" is ignored if the patient is not having an episode. emPATH supports the following types of interactions: • Autonomous - the interaction does not require an external artifact (sensor, data source, or wearer). Autonomous interactions are typically processed by behaviors. • Interrogative Multiple Selection - the interaction is querying an external artifact for information. The artifact may be a sensor, data source, or the wearer. The interaction is expecting a set of data (more than one piece of information). The set typically consists of RDF triples. • Interrogative with Single Selection - the interaction is querying an external artifact for information. The artifact may be a sensor, data source, or the wearer. The interaction is expecting only one piece of data typically in the form of an RDF triple. • Interrogative with Unstructured response - the interaction is querying an external artifact for information. The artifact may be a sensor, data source, or the wearer. The interaction is expecting one piece of unstructured data, which may include text, photos, video, or audio. • Imperative - the interaction is expressing a command or request to the wearer. This may be to instruct the wearer (for example, "exit the room") or command an inanimate object (for example, "turn on oxygen pump"). • Declarative - the interaction is expressing something to the wearer. For example, "job well done!". For example, the following XML represents an interaction to ask the wearer how they feel: <Interaction> <Text>How do you feel?</Text> <SystemID>http://www.carethings.com/sysId#60bd</SystemID> <Type>InterrogativeSingleSelect</Type> <Responses> <Response> <Label>Good</Label> </Response> <Response> <Label>Fair</Label> </Response> <Response> <Label>Poor</Label> </Response> </Responses> </Interaction> Interactions are quite often used to communicate with the wearer. This is known as a "human-in- loop" (HIL) because the wearer is directly involved with the interaction. emPATH provides a number features specifically for HIL interactions. The following features are supported: • Indicate a text message to be shown to the wearer. • Indicate a text resource to be shown to the wearer.
• Indicate an audio message to be played to the wearer.
• Indicate an ontology to be sent to the rendering engine. The assumption is that the rendering engine can translate the ontology to the appropriate rendering for the wearer.

For example, suppose the interaction needs to determine if the wearer is experiencing pain while eating. The XML would be as follows, in which the rendering engine is given the option of displaying simple text, displaying rich text (a text resource), playing an audio snippet, and/or providing an ontology:

<Interaction>
  <Text>When you are eating, how intense is your pain?</Text>
  <TextResource>http://www.carethings.com/utterance#eatingPain.html</TextResource>
  <AudioResource>http://www.carethings.com/utterance#audioEatingPain.m4a</AudioResource>
  <Ontology>http://www.carethings.com/utterance#classEatingPain</Ontology>
  <SystemID>http://www.carethings.com/sysId#60bd</SystemID>
  <Type>InterrogativeSingleSelect</Type>
  <Responses>
    <Response>
      <Label>Intense</Label>
    </Response>
    <Response>
      <Label>Mild</Label>
    </Response>
    <Response>
      <Label>No Pain</Label>
    </Response>
  </Responses>
</Interaction>

Interactions support many of the same features as interventions, including real-time constraints, the ability to update the world model, and conditions that indicate whether the interaction is active. For example, suppose that the interaction is only applicable if the wearer has breast cancer. The XML for the interaction would look as follows. If the wearer does not have breast cancer, the interaction is skipped.

<Interactions>
  <Interaction>
    <Active>
      <Condition lang="clips">
        (patient affliction breastCancer)
      </Condition>
    </Active>
  </Interaction>
</Interactions>

Interactions that specify a behavior are said to be autonomous: the behavior is responsible for executing the interaction, and once the behavior completes processing, the interaction is complete. The wearer never sees the interaction and hence does not need to respond to it. The goal of second-generation mobile solutions is to be predominantly autonomous. It will be difficult for second-generation and beyond mobile solutions to be HIL since wearers are restricted
in their available time to react to mobile applications. The XML for an autonomous interaction would look as follows. If multiple behaviors are indicated, the behaviors are executed in the order in which they appear in the XML.

<Interactions>
  <Interaction>
    <PathwayBehaviors>
      <Behavior>http://www.carethings.com/behavior#AnalyzeData</Behavior>
      <Behavior>http://www.carethings.com/behavior#NotifyProvider</Behavior>
    </PathwayBehaviors>
  </Interaction>
</Interactions>

Responses

Responses represent system-accepted ways of responding to interactions. Responses are typically only used for HIL interactions. The emPATH framework makes no assumptions about how the responses are used by the rendering engines. Interactions are not required to have responses, and when responses are available, rendering engines are not required to display them verbatim to the wearer. A response may represent work to be performed if selected by the wearer, such as enabling the on-board camera. The type of the response indicates the intended behavior. Accepted response types include the following (an illustrative example appears after the list):

• Free - the response consists of unstructured text with no data limit.
• FreeFixed - the response consists of unstructured text but is limited to a specific number of characters. Currently the rendering engine defines the limit.
• Directive - the response is considered an "instruction" to the wearer.
• Fixed - the response represents a choice such as a check box.
• FixedNext - the response represents a choice; selection by the wearer results in the care pathway continuing execution to the next interaction.
• VAS - the response represents a Visual Analog Scale (VAS).
• DVAS - the response represents a Digital Visual Analog Scale.
• Camera - the response, if selected, results in the mobile device activating the on-board camera.
• Video - the response, if selected, results in the mobile device activating the on-board video camera.
• Sensor - the response, if selected, results in the mobile device activating the associated sensor.
• Scan - the response, if selected, results in the mobile device activating the on-board camera. The resulting photo is assumed to contain a barcode. On-board barcode scanning software interprets the photo and the resulting barcode, if successful, is stored in the response object.
• OCR - the response, if selected, results in the mobile device activating the on-board camera. The resulting photo is assumed to contain text. On-board OCR software interprets the photo and the resulting text is stored in the response object.
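The following sketch shows an interaction that mixes two of the response types above. It is illustrative only: the <Type> values Scan and Free are taken from the list, while the question text and labels are hypothetical and the system IDs are omitted for brevity.

<Interaction>
  <Text>Please identify your medication.</Text>
  <Type>InterrogativeSingleSelect</Type>
  <Responses>
    <Response>
      <!-- Activates the on-board camera; the decoded barcode is stored in the response object -->
      <Type>Scan</Type>
      <Label>Scan the barcode on the bottle</Label>
    </Response>
    <Response>
      <!-- Unstructured text entry with no data limit -->
      <Type>Free</Type>
      <Label>Type the medication name</Label>
    </Response>
  </Responses>
</Interaction>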
The response construct contains a number of features that are intended for HIL interaction. Given the assumption that a response will be visualized on the mobile device for wearer interaction, the following features are supported (an illustrative example follows the list):

• Value constraints - a list of values from which a wearer can choose the appropriate value. For example, the wearer may be asked for their zip code; the value constraints would list the applicable zip codes in the wearer's immediate area.
• Code value - typically a study-recognized identifier for the response. Provides the ability for a researcher to use the pathway in conjunction with existing research software.
• Label - text displayed by the rendering engine for the response. The rendering engine is free to use the text in any way or manner.
• Label Resource - a resource used when the response requires a more expressive way to communicate with the wearer. For example, the resource may be an HTML page; the rendering engine will then render the HTML page as opposed to simple text.
• Format - indicates the format of the response. This is only applicable for responses of type Free or FreeFixed. The rendering engine is responsible for supporting the format information. Currently supported formats include numeric, alpha, alphanumeric, date, datetime, monetary, and phone.

Using a previous example, suppose the interaction needs to determine if the wearer is experiencing pain while eating. The XML, with code values added, would be as follows:

<Interaction>
  <Text>When you are eating, how intense is your pain?</Text>
  <SystemID>http://www.carethings.com/sysId#60bd</SystemID>
  <Type>InterrogativeSingleSelect</Type>
  <Responses>
    <Response>
      <Label>Intense</Label>
      <Code>STUDY1002-INTENSE</Code>
    </Response>
    <Response>
      <Label>Mild</Label>
      <Code>STUDY1002-MILD</Code>
    </Response>
    <Response>
      <Label>No Pain</Label>
      <Code>STUDY1002-NOPAIN</Code>
    </Response>
  </Responses>
</Interaction>
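The sketch below illustrates the Format feature for a free-text response. The guide does not show the XML serialization of this feature, so the <Format> element name and its placement inside <Response> are assumptions; only the response type Free and the format value numeric come from the list above, and the question text and label are hypothetical.

<Interaction>
  <Text>How many hours did you sleep last night?</Text>
  <Type>InterrogativeSingleSelect</Type>
  <Responses>
    <Response>
      <Type>Free</Type>
      <!-- Assumption: the format feature is expressed as a <Format> element -->
      <Format>numeric</Format>
      <Label>Enter the number of hours</Label>
    </Response>
  </Responses>
</Interaction>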
Transitions

Transitions in emPATH are supported in three ways: "skip-to" instructions, decision points, and interaction transitions. Skip-to instructions are provided within the response constructs of care pathways. The instructions indicate which intervention to transition to if the response is selected. This is the simplest form of transitioning supported by emPATH: the wearer indirectly decides the transition for the pathway by selecting a response. emPATH provides the response element <SkipTo> to indicate transitioning. The following XML shows the use of the <SkipTo> element for transitioning.

<Interaction>
  <Text>When you are eating, how intense is your pain?</Text>
  <SystemID>http://www.carethings.com/sysId#60bd</SystemID>
  <Type>InterrogativeSingleSelect</Type>
  <Responses>
    <Response>
      <Label>Intense</Label>
      <SkipTo>http://www.carethings.com/sysId#60ccc</SkipTo>
    </Response>
    <Response>
      <Label>Mild</Label>
      <SkipTo>http://www.carethings.com/sysId#60ccc</SkipTo>
    </Response>
    <Response>
      <Label>No Pain</Label>
      <SkipTo>http://www.carethings.com/sysId#60cff</SkipTo>
    </Response>
  </Responses>
</Interaction>

Interaction skip-tos are accomplished the same way, except that the instruction appears within the <Interaction> element as follows:

<Interaction>
  <Text>When you are eating, how intense is your pain?</Text>
  <SystemID>http://www.carethings.com/sysId#60bd</SystemID>
  <Type>InterrogativeSingleSelect</Type>
  <SkipTo>http://www.carethings.com/sysId#60ccc</SkipTo>
</Interaction>

The skip-to is followed once the interaction has completed processing. If both response skip-to instructions and interaction skip-to instructions exist, the response instructions take precedence.

Decision points are specific junctures within a care pathway at which a decision is made to determine the next path. Decision points use the emPATH blackboard to derive the decision, so a decision point can be influenced by manipulating the blackboard. Decision points consist of one or more decisions and the resulting interactions. There are four types of decision points:

• And Split: a decision in which more than one path in the care pathway can be chosen. Typically each path executes in parallel.
• Or Split: a decision in which only one path in the care pathway is chosen.
• And Join: a decision point that waits for more than one path in the care pathway to complete.
• Or Join: a decision point that waits for only one path in the care pathway to complete.

A decision point represented in XML looks very much like an interaction. The following XML shows an And-Split decision point. The list of possible transitions is expressed using the <Transition> element.

<DecisionPoint>
  <ID>http://www.carethings.com/userId#DP1</ID>
  <Ontology>http://www.carethings.com/ontology/painValue5</Ontology>
  <SystemID>http://www.carethings.com/sysId#35ae3d</SystemID>
  <Type>AndSplit</Type>
  <Transitions>
    <Transition>
      <SystemID>http://www.carethings.com/sysId#2333238</SystemID>
      <To>http://www.carethings.com/userId#DP1</To>
      <Active>
        <Condition>
          (?val relative oralPain)
          (test exists)
        </Condition>
      </Active>
    </Transition>
    <Transition>
      <SystemID>http://www.carethings.com/sysId#2333238</SystemID>
      <To>http://www.carethings.com/userId#DP1</To>
      <Active>
        <Condition>
          (?val relative oralPain)
          (test not exists)
        </Condition>
      </Active>
    </Transition>
  </Transitions>
</DecisionPoint>

Join decision points do not require any special XML constructs since their behavior is provided within emPATH; only the type value is required for any type of join decision point. An example And-Join decision point would look as follows.

<DecisionPoint>
  <ID>http://www.carethings.com/userId#DP1</ID>
  <Ontology>http://www.carethings.com/ontology/painValue5</Ontology>
  <SystemID>http://www.carethings.com/sysId#35ae3d</SystemID>
  <Type>AndJoin</Type>
</DecisionPoint>

The final type of transition involves transitions from within interactions. These transitions do not involve responses; transitions from an interaction are used for autonomous interactions. The following is an example XML:

<Interaction>
  <Text>When you are eating, how intense is your pain?</Text>
  <SystemID>http://www.carethings.com/sysId#60bd</SystemID>
  <Type>Autonomous</Type>
  <Transitions>
    <Transition>
      <SystemID>http://www.carethings.com/sysId#2333238</SystemID>
      <To>http://www.carethings.com/userId#DP1</To>
      <Active>
        <Condition>
          (?val relative oralPain)
          (test exists)
        </Condition>
      </Active>
    </Transition>
    <Transition>
      <SystemID>http://www.carethings.com/sysId#2333238</SystemID>
      <To>http://www.carethings.com/userId#DP1</To>
      <Active>
        <Condition>
          (?val relative oralPain)
          (test not exists)
        </Condition>
      </Active>
    </Transition>
  </Transitions>
</Interaction>

Behaviors

It would be difficult to model all aspects of a care pathway within an XML definition. Care pathways need to execute algorithms that cannot be described through XML. For these situations, emPATH defines behaviors. Behaviors are assumed to execute within the mobile device. For example, for Apple devices such as the iPhone and iPad, a behavior is defined as an Objective-C class. For Android-based devices, a behavior is defined as a Java class. For J2ME-based phones, behaviors are J2ME Java classes. Behaviors can be referenced within interactions, interventions, and responses. There are three points at which a behavior can be invoked in an emPATH resource:

• Before processing of the resource (pre-processing).
• During processing of the resource (autonomous).
• After processing of the resource (post-processing).

For example, suppose the care pathway would like to ask the wearer for their level of pain and to have the question reference the last pain value derived from the on-board personal health record. The behavior LastPainResponse constructs the appropriate question and adds it to the interaction.

<Interaction>
  <Text>Your last pain level was XXXX. What is your current level?</Text>
  <PreBehavior>LastPainResponse</PreBehavior>
  <Type>InterrogativeSingleSelect</Type>
  <Responses>
    <Response>
      <Label>Most intense pain</Label>
    </Response>
    <Response>
      <Label>Tolerable</Label>
    </Response>
    <Response>
      <Label>No pain</Label>
    </Response>
  </Responses>
</Interaction>
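To show all three invocation points on a single resource, the following sketch combines the <PreBehavior>, <PathwayBehaviors>, and <PostBehavior> elements that appear elsewhere in this guide. It is illustrative only: the guide shows <PostBehavior> at the intervention level, so its use here at the interaction level is an assumption, and the LoadSensorData behavior URI is hypothetical.

<Interaction>
  <Type>Autonomous</Type>
  <!-- Pre-processing: executes before the interaction is processed (hypothetical URI) -->
  <PreBehavior>http://www.carethings.com/behavior#LoadSensorData</PreBehavior>
  <!-- Autonomous processing: the behavior executes the interaction itself -->
  <PathwayBehaviors>
    <Behavior>http://www.carethings.com/behavior#AnalyzeData</Behavior>
  </PathwayBehaviors>
  <!-- Post-processing: executes after the interaction completes (assumed valid at this level) -->
  <PostBehavior>http://www.carethings.com/behavior#NotifyProvider</PostBehavior>
</Interaction>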
Example Care Pathways

This section defines a number of care pathways using the emPATH-supported XML protocol. The intent is to allow pathway designers to construct and deploy emPATH care pathways quickly. The XML examples are defined to be complete and fully functional.

Example 1: Pain Survey (2 Questions)

<CarePathway>
  <Interventions>
    <Intervention>
      <SystemID>http://www.carethings.com/sysId#d2c157</SystemID>
      <Interactions>
        <Interaction>
          <ID>http://www.carethings.com/userId#Q1</ID>
          <Text>When you ARE NOT talking, eating, or drinking, how
            intense (severe, strong) is the pain in your mouth?</Text>
          <SystemID>http://www.carethings.com/sysId#35ae3d</SystemID>
          <Type>InterrogativeSingleSelect</Type>
          <ResponseRequired>YES</ResponseRequired>
          <Responses>
            <Response>
              <SystemID>http://www.carethings.com/sysId#285510</SystemID>
              <Type>Fixed</Type>
              <Label>No pain</Label>
            </Response>
            <Response>
              <SystemID>http://www.carethings.com/sysId#2222610</SystemID>
              <Type>Fixed</Type>
              <Label>No pain</Label>
            </Response>
            <Response>
              <SystemID>http://www.carethings.com/sysId#266610</SystemID>
              <Type>Fixed</Type>
              <Label>The most intense pain sensation imaginable</Label>
            </Response>
          </Responses>
        </Interaction>
        <Interaction>
          <ID>http://www.carethings.com/userId#Q2</ID>
          <Text>When you ARE talking, eating, or drinking, how
            intense (severe, strong) is the pain in your mouth?</Text>
          <SystemID>http://www.carethings.com/sysId#f992c7</SystemID>
          <Type>InterrogativeSingleSelect</Type>
          <ResponseRequired>YES</ResponseRequired>
          <Responses>
            <Response>
              <SystemID>http://www.carethings.com/sysId#285444</SystemID>
              <Type>Fixed</Type>
              <Label>No pain</Label>
            </Response>
            <Response>
              <SystemID>http://www.carethings.com/sysId#28610</SystemID>
              <Type>Fixed</Type>
              <Label>The most intense pain sensation imaginable</Label>
            </Response>
          </Responses>
        </Interaction>
        <Interaction>
          <ID>http://www.carethings.com/userId#iFinalInteraction</ID>
          <Text/>
          <Type>Imperative</Type>
          <PathwayBehaviors>
            <Behavior>http://www.carethings.com/sysBehavior/className#Final</Behavior>
          </PathwayBehaviors>
        </Interaction>
      </Interactions>
    </Intervention>
  </Interventions>
</CarePathway>

Example 2: Pain Survey with Conditional Transitions (3 Questions)

<CarePathway>
  <Interventions>
    <Intervention>
      <SystemID>http://www.carethings.com/sysId#d2c157</SystemID>
      <Ontology>http://www.carethings.com/ontology/mainInteraction</Ontology>
      <Interactions>
        <Interaction>
          <ID>http://www.carethings.com/userId#Q1</ID>
          <Text>When you ARE NOT talking, eating, or drinking, how
            intense (severe, strong) is the pain in your mouth?</Text>
          <Ontology>http://www.carethings.com/ontology/painValue5</Ontology>
          <SystemID>http://www.carethings.com/sysId#35ae3d</SystemID>
          <Type>InterrogativeSingleSelect</Type>
          <ResponseRequired>YES</ResponseRequired>
          <Responses>
            <Response>
              <SystemID>http://www.carethings.com/sysId#254610</SystemID>
              <Type>Fixed</Type>
              <Label>No pain</Label>
              <WorldModel>
                <Fact>
                  (patient painlevel nopain)
                </Fact>
              </WorldModel>
            </Response>
            <Response>
              <SystemID>http://www.carethings.com/sysId#243610</SystemID>
              <Type>Fixed</Type>
              <Label>The most intense pain sensation imaginable</Label>
              <WorldModel>
                <Fact>
                  (patient painlevel intense)
                </Fact>
              </WorldModel>
            </Response>
          </Responses>
          <Transitions>
            <Transition>
              <SystemID>http://www.carethings.com/sysId#233238</SystemID>
              <Active>
                <Condition>
                  (patient painlevel nopain)
                </Condition>
              </Active>
              <To>http://www.carethings.com/sysId#f992c7</To>
            </Transition>
            <Transition>
              <SystemID>http://www.carethings.com/sysId#2333238</SystemID>
              <Active>
                <Condition>
                  (patient painlevel intense)
                </Condition>
              </Active>
              <To>http://www.carethings.com/sysId#f99444</To>
            </Transition>
          </Transitions>
        </Interaction>
        <Interaction>
          <ID>http://www.carethings.com/userId#Q2</ID>
          <Text>When you ARE talking, eating, or drinking, how
            intense (severe, strong) is the pain in your mouth?</Text>
          <Ontology>http://www.carethings.com/ontology/interaction</Ontology>
          <SystemID>http://www.carethings.com/sysId#f992c7</SystemID>
          <Type>InterrogativeSingleSelect</Type>
          <ResponseRequired>YES</ResponseRequired>
          <Responses>
            <Response>
              <SystemID>http://www.carethings.com/sysId#28</SystemID>
              <Type>Directive</Type>
              <Label>No pain</Label>
            </Response>
            <Response>
              <SystemID>http://www.carethings.com/sysId#28610</SystemID>
              <Type>Directive</Type>
              <Label>The most intense pain sensation imaginable</Label>
            </Response>
          </Responses>
        </Interaction>
        <Interaction>
          <ID>http://www.carethings.com/userId#iFinalInteraction</ID>
          <Text/>
          <Type>Imperative</Type>
          <PathwayBehaviors>
            <Behavior>http://www.carethings.com/sysBehavior/className#Final</Behavior>
          </PathwayBehaviors>
        </Interaction>
        <Interaction>
          <ID>http://www.carethings.com/userId#Q3</ID>
          <Text>When you ARE NOT talking, eating, or drinking, how
            sharp (like a knife) is the pain in your mouth?</Text>
          <Ontology>http://www.carethings.com/ontology/interaction</Ontology>
          <SystemID>http://www.carethings.com/sysId#f99444</SystemID>
          <Type>InterrogativeSingleSelect</Type>
          <ResponseRequired>YES</ResponseRequired>
          <Responses>
            <Response>
              <SystemID>
                http://www.carethings.com/sysId#28
              </SystemID>
              <Type>Directive</Type>
              <Label>No pain</Label>
            </Response>
            <Response>
              <SystemID>http://www.carethings.com/sysId#28610</SystemID>
              <Type>Directive</Type>
              <Label>The most intense pain sensation imaginable</Label>
            </Response>
          </Responses>
        </Interaction>
        <Interaction>
          <ID>http://www.carethings.com/userId#iFinalInteraction</ID>
          <Text/>
          <Type>Imperative</Type>
          <PathwayBehaviors>
            <Behavior>http://www.carethings.com/sysBehavior/className#Final</Behavior>
          </PathwayBehaviors>
        </Interaction>
      </Interactions>
    </Intervention>
  </Interventions>
</CarePathway>

Example 3: Pain Survey with Iteration

In this example, we are going to list the family members known to the system and then ask the wearer which of those family members have oral pain. The wearer will then be asked a pain question for each member.

<ClinicalPathway>
  <Interventions>
    <Intervention>
      <SystemID>http://www.carethings.com/sysId#d2c157</SystemID>
      <Ontology>http://www.carethings.com/ontology/mainInteraction</Ontology>
      <Interactions>
        <Interaction>
          <ID>http://www.carethings.com/userId#Q1</ID>
          <Text>Which of the following members of your family are
            having oral pain?</Text>
          <Ontology>http://www.carethings.com/ontology/painValue5</Ontology>
          <SystemID>http://www.carethings.com/sysId#35ae3d</SystemID>
          <Type>InterrogativeSingleSelect</Type>
          <ResponseRequired>YES</ResponseRequired>
          <PreBehavior>http://www.carethings.com/behavior#SetupFamilyMembers</PreBehavior>
          <Responses>
            <!-- The response list is generated by the preBehavior SetupFamilyMembers.
                 The following is an example of a generated response. -->
            <Response>
              <SystemID>http://www.carethings.com/sysId#28</SystemID>
              <Type>Fixed</Type>
              <Label>Janet Farmer</Label>
              <WorldModel>
                <Fact lang="clips">
                  (JanetFarmer relative oralPain)
                </Fact>
              </WorldModel>
            </Response>
          </Responses>
        </Interaction>
        <DecisionPoint>
          <ID>http://www.carethings.com/userId#DP1</ID>
          <Ontology>http://www.carethings.com/ontology/painValue5</Ontology>
          <SystemID>http://www.carethings.com/sysId#35ae3d</SystemID>
          <Type>OrSplit</Type>
          <Transitions>
            <Transition>
              <SystemID>http://www.carethings.com/sysId#2333238</SystemID>
              <To>http://www.carethings.com/sysId#f111c7</To>
              <Active>
                <Condition>
                  (?val relative oralPain)
                  (test exists)
                </Condition>
              </Active>
            </Transition>
            <Transition>
              <SystemID>http://www.carethings.com/sysId#2333238</SystemID>
              <To>http://www.carethings.com/sysId#a112c7</To>
              <Active>
                <Condition>
                  (?val relative oralPain)
                  (test not exists)
                </Condition>
              </Active>
            </Transition>
          </Transitions>
        </DecisionPoint>
        <Interaction>
          <ID>http://www.carethings.com/userId#Q2</ID>
          <Text>When {? relative oralPain} IS talking, eating, or drinking,
            how intense (severe, strong) is the pain in their mouth?</Text>
          <Ontology>http://www.carethings.com/ontology/interaction</Ontology>
          <SystemID>http://www.carethings.com/sysId#f111c7</SystemID>
          <Type>InterrogativeSingleSelect</Type>
          <ResponseRequired>YES</ResponseRequired>
          <Responses>
            <Response>
              <SystemID>http://www.carethings.com/sysId#28</SystemID>
              <Type>Fixed</Type>
              <Label>No pain</Label>
            </Response>
            <Response>
              <SystemID>http://www.carethings.com/sysId#28610</SystemID>
              <Type>Fixed</Type>
              <Label>The most intense pain sensation imaginable</Label>
            </Response>
          </Responses>
        </Interaction>
        <DecisionPoint>
          <ID>http://www.carethings.com/userId#DP2</ID>
          <SystemID>http://www.carethings.com/sysId#35383d</SystemID>
          <Type>AndSplit</Type>
          <Transitions>
            <Transition>
              <SystemID>http://www.carethings.com/sysId#2333238</SystemID>
              <To>http://www.carethings.com/sysId#35ae3d</To>
            </Transition>
          </Transitions>
        </DecisionPoint>
        <Interaction>
          <ID>http://www.carethings.com/userId#Q3</ID>
          <Text>When you ARE NOT talking, eating, or drinking, how
            sharp (like a knife) is the pain in your mouth?</Text>
          <Ontology>http://www.carethings.com/ontology/interaction</Ontology>
          <SystemID>http://www.carethings.com/sysId#a112c7</SystemID>
          <Type>InterrogativeSingleSelect</Type>
          <ResponseRequired>YES</ResponseRequired>
          <Responses>
            <Response>
              <SystemID>http://www.carethings.com/sysId#28</SystemID>
              <Type>Fixed</Type>
              <Label>No pain</Label>
            </Response>
            <Response>
              <SystemID>
                http://www.carethings.com/sysId#28610
              </SystemID>
              <Type>Fixed</Type>
              <Label>The most intense pain sensation imaginable</Label>
            </Response>
          </Responses>
        </Interaction>
        <Interaction>
          <ID>http://www.carethings.com/userId#iFinalInteraction</ID>
          <Text/>
          <Type>Imperative</Type>
          <PathwayBehaviors>
            <Behavior>http://www.carethings.com/sysBehavior/className#Final</Behavior>
          </PathwayBehaviors>
        </Interaction>
      </Interactions>
    </Intervention>
  </Interventions>
</ClinicalPathway>