The Ethics of Wearables: The Crucial Conversation Few Are Having
By Kelly Teal, Senior Editor
Ethics and health care go hand in hand. And now, with the rapid growth of
wearables, practitioners need an ethical framework that encompasses these
devices—connected or not. But, to date, no such standard exists. Many questions
remain unanswered, such as how implications change according to the type of
patient or type of gadget. As wearables filter more and more into practitioners’
consciousness, it’s essential to understand and weigh the ethical considerations.
Empowering Patients. Innovating Outcomes.
REPORT
Wearables grow more sophisticated by the day. They don’t just track heart rates and steps
walked anymore—some now interact with the brain, while others soon will dispense
medication. Many feed sensitive personal biometric data across the Internet into mobile apps.
That capability, of course, raises the thorny twin issues of privacy and security, areas the media,
science and academia continue to explore. But what of the ethical implications of these wearables,
which patients can use for medical purposes? And how might the implications change according to the
type of patient or the type of gadget—for example, whether it provides basic pain relief or whether it
also sends information to an electronic medical record (EMR) or insurance company? These questions
have so far eluded the larger public conversation. Sources agree: No standard exists that specifically
addresses health wearables and their ethical considerations. But such a framework will prove essential
as more wearables filter into clinicians’ consciousness, and as more insurers require such gadgets for
wellness incentives and lower premiums. This report aims to create a standards framework, one by
which practitioners and payers alike can act, and to highlight the gaps that neither party should overlook.
Viewing Wearables Through a Philosophical Lens
Right off the bat, one might wonder whether the ethics of
health wearables vary according to philosophical bent—
say, Kant, Aristotle or Hobbes. Leading health care thinkers
present a mixed bag of answers. Paul Ford, Ph.D., director
of the Cleveland Clinic’s NeuroEthics Program, falls less
on the side of philosophy than of gauging responsibility,
which changes depending upon whether a practitioner
works as a doctor for a football team or serves individual
patients, he said. “That’s the lens I think is important.”
In other words, if a team doctor must rehab a player so
he can get back on the field, a wearable might force the
sharing of sensitive data with a number of people. But
a private practitioner may not want to risk a patient’s
confidential information landing in the wrong hands.
Meanwhile, Anne Lara, RN, chief information officer for
Union Hospital of Cecil County, said health wearable ethics
do not change, regardless of philosophical tradition. “I
think what changes is how you engage,” she said. “If
you think historically and traditionally in health care, the
practitioner’s always been the captain of the ship. [With
wearables,] it depends upon how the practitioner wants
to engage the consumer in his or her own plan of care.”
Jan Oldenburg, senior manager in Ernst & Young’s
Advisory Health Care Practice, made a similar point.
“It’s the degree to which the practitioner is interested
in, and respects, patients and their data,” she said. For
instance, some practitioners would not accept a patient’s
wearable-generated blood pressure reading because it did
not happen in the doctor’s office. “So the degree of trust
is not only in patients as participants in their own care,
but also in the devices they’re using,” Oldenburg said.
Finally, one professor said yes, the ethics of wearables
shift according to philosophical take, although, of course,
the philosophers themselves have not addressed these
technologies. However, whether Kant, for instance,
would support wearables does not settle the ethics
question, said Harald Schmidt, Ph.D., assistant professor
and research associate within the Perelman School of
Medicine at the University of Pennsylvania. Instead, he
said, “An overarching tradeoff…that different moral
theories all address in some way, is that between
respecting autonomy (typically associated with Kant) and
maximizing overall utility or consequences (generally
favored by utilitarianism). Due to their potential for intrusive
surveillance, wearables pose a threat to autonomy, even
though overall utility of either nudging or requiring people
to use them can be helpful for a better understanding of a
population’s current health and progress in improving it.”
As Oldenburg noted, existing structures can help build the
ethical standards for wearables. “They might not always be
perfect but they are appropriate starting points,” she said.
Constructing the Ethical Framework
With that in mind, sources say that, above all, any guideline
for recommending a wearable for health must abide
by the Hippocratic Oath’s overarching theme: patient
beneficence. But, of course, the specifics run deeper.
Do No Harm. A key element of “do no harm” means
determining whether a patient will do better with a
wearable than without it. If a device will prevent the
occurrence or relapse of health risks, cut readmission
costs and take up fewer resources, said Schmidt,
then recommending a patient use a wearable seems
obvious. On the other hand, Ford said, practitioners
should think about the following: Could the wearable
distract the patient from paying attention to an important
task such as driving? Could any buzzes and beeps go
against the patient’s interest, such as needing deep
sleep? “The metric of whether or not it’s implanted is
a good measure of whether it’s risky,” Ford added.
In the meantime, don’t forget that “do no harm” includes
the apps tied to these wearables. Harm can stem from
an inaccurate or improperly performing app or its
associated device, said Adam C. Powell, Ph.D., president
of consultancy Payer+Provider Syndicate. “If an app is
trying to measure someone’s heart rate and the reading is
miscalibrated, the doctor may make the wrong decision.
It’s important that these things work as intended.”
This could mean recommending only devices that have
received FDA clearance. And right now, there aren’t many,
Schmidt said. “There is currently no office to regulate
[connected health devices] and related applications, and
only 100 of around 90,000 health apps have been reviewed
by the FDA,” Schmidt wrote in a draft manuscript, “The
ethics of remote monitoring through connected devices:
When, if ever, should their use be required?,” presented
at a University of Pennsylvania conference last year.
More than Medicine. Next, when it comes to wearables,
a practitioner must do more than make the best medical
decision for patients, said Steven Steinhubl, M.D.,
director of digital medicine for Scripps Translational
Science Institute. That means accounting for privacy
and emotional needs. One of the biggest gaps clinicians
could overlook “is not fully recognizing the potential
downside of personal monitoring, such as undue anxiety,”
Steinhubl said. “All people will have a different response
to self-monitoring and we need to be cognizant of that,
and be prepared to deal with it—ideally, proactively.”
Along those lines, one UK doctor, Des Spence, wrote in
the British Medical Journal in April 2015 that connected
wearables, which provide around-the-clock monitoring,
could foment “extreme anxiety” among users, particularly
the “worried well.” Ford agreed: “It actually may take
some extra responsibility when recommending these
devices to know how to interpret and safely apply their
data. What’s the mental health impact of these devices?
Is it going to make [a patient] happier or more obsessed?”
If a practitioner works with a patient prone to anxiety,
or who has obsessive-compulsive disorder or similar
conditions, recommending a wearable may not prove
suitable. “You want to make sure that any technology a
physician recommends is going to enrich the patient’s
life and not isolate them further from the things they
enjoy and need as social individuals,” Ford said.
Patient Capacity. Each source said the ethics of wearables
do indeed vary by patient type—fetal, child, adolescent,
adult, elderly, mentally or physically handicapped,
with Schmidt perhaps putting the matter in the most
succinct terms: “Elements of surveillance and control
require informed consent—and these groups differ in
their capacity to consent.” Experts agreed that following
existing informed consent practices should work just
fine for recommending wearables. At the same time,
insurers, too, must have the flexibility to accommodate
customers who might not have the means—physical,
mental, emotional or financial—to comply with wearables
usage. Not everyone can walk, for instance, or afford
the time or money away from work and family to meet
requirements for earning premium reductions. “The fact
that people have different baseline conditions matters,
especially for penalty-based incentives,” Schmidt said.
Security. Reminders of data insecurity crop up every
day in the form of news reports about breached banks,
insurers, retailers. Hackers want their hands on personal
information they can exploit. And what qualifies as more
personal than medical data? To that end, practitioners
and insurers must go as far as possible to recommend
devices that adhere to the strictest security codes. Here,
clinicians must do more to boost their knowledge on this
front, no small mandate. To ease the burden, think about
hiring an IT expert who can vet devices for compliance with
HIPAA and other privacy standards; make sure the data
integrate into EMRs; and run an airtight internal network.
This person could join the employee rolls or she could
handle projects as a third party. Some such consultants, or
“channel partners,” charge retainers, while others charge
by the hour or the assignment. Yes, this costs on the front
end, but calculate the expense of a lawsuit or an audit, and
then compare that to the numbers for hiring or contracting.
An IT guru would prove invaluable. Practitioners would not
have to take on a task outside of their training or interest,
and they would ensure the security, as much as possible,
of their patients’ data. After all, the practice of medicine
more and more will involve connected wearables, and their
implantable and ingestible counterparts. Accepting and
planning for that reality now seems like the ethical move.
Privacy, Confidentiality. Similar to security, the need
for privacy and confidentiality assurances almost goes
without saying. Almost, since the directives bend according
to patient type, as discussed above. And the consent
precedents in place for patients including children,
adolescents, the elderly and people with mental disabilities
should help guide the process for gating wearables privacy
and confidentiality, said Oldenburg. “Some of those same
rules might apply,” she said. For instance, children 12
and older can make certain medical decisions in private
while other decisions require a legal guardian present,
she said. And when it comes to wearables, someone
has to have access to the data generated; depending
on the patient’s age and capacity, the information may
not belong just to that person. “A child may not be
able to have his or her own account but a parent might
be able to have an account on behalf of that child,”
Oldenburg added. “Both consent to it…and monitor it.”
“I do think transparency has
to be a really, really key value
and a key part of building an
ethical framework.”
—Ernst & Young’s Jan Oldenburg
Level of Intrusion. Last of all, consider that the ethics of
health wearables vary by the kind of device and the level of
intrusion it imposes. Intrusion from a pain-relief wearable,
for example, differs from the intrusion of a glucometer
that wants to share location and heart-rate data. Ford
uses the example of Google Glass, Google’s wearable that
contains a video camera and an Internet connection. Sure,
the product comes in handy in the operating room but
what about someone wearing it around family and friends?
What if a practitioner-recommended wearable records
people other than the patient without their knowledge or
consent? All in all, “It is generally desirable to minimize
intrusion as much as possible,” Schmidt wrote in his paper.
Identifying the Gaps
The above comprises a basic ethics framework but it
contains some gaps, as the health wearables sector
remains nascent and advancements seem to come
to light each week. Identifying all of the holes will
take some time. “This is part of a longer dialogue
that’s already underway about patient empowerment
and patient choice in health care,” Oldenburg said.
Still, some of the concerns stand out now, giving
practitioners and insurers more points to ponder. The
takeaway? Black-and-white verdicts do not exist.
Who’s Liable? In a sue-happy society, “Who’s
liable?” often arises as the primary question. The
same will apply for practitioners thinking about
recommending health wearables. Expect no clear-cut
answers here, as the issues have not yet been
tested in courts; therefore, they lack a precedent. That
will change but for now, consider the following:
If a practitioner tells a patient to wear a gadget
such as a Fitbit and tally steps and weight, then that
clinician should have the wherewithal to ask for or
monitor the data generated, Lara said. Otherwise, that
doctor or nurse could invite a lawsuit, she said. “It’s
almost like prescribing, and if I forget to monitor, then
there’s more of a responsibility there,” she said.
For Oldenburg, the question remains a bit more
nebulous. “I know people worry if they have a
continuous stream of data from a patient, what are
their obligations? If something happens, am I liable
just because I had access to the data?…The liability
is, I think, a very real concern. Because there just isn’t
case law about that and we don’t have frameworks
for thinking about what this means in context.”
But Powell said existing research could help answer
the question of liability and serve as a guide.
“Physicians recommend…all sorts of things that
aren’t necessarily regulated products,” he said.
“This perhaps falls in the same category.”
One solution to vetting devices and their apps
could come in the form of outside groups. These
third parties could conduct reviews and make
suggestions to practitioners, Powell said.
Who Sees the Data? Here, directed consent must come
into play. “I have a strong suspicion that people may
want to share some kinds of data with their practitioners
and some with only a particular practitioner,” Oldenburg
said. Or, a patient may feel all right about a health insurer
receiving the wearable-generated data but uncomfortable
about a life insurer seeing the same information. “So being
really clear about why the data is being collected, how it’s
being used and what any secondary uses of that data are,
as well as the degree to which a person will be anonymized
in any broader data sets versus individualized” all make
up key considerations, she said. “The ideal is the person
can choose to share their data with as many parties in the
health care system as they want, and feels to a person that
whatever they’re using helps them feel more embedded
in a caring community and a caring context,” Oldenburg
said. Amid all of that, though, Schmidt emphasized the
importance of finding a way to keep from overwhelming
practitioners “with data that have only marginal utility.”