1. The Ethics of Computing in Pharmaceutical Research
2. Introduction
There is no doubt that computers, computing technology, and the consequent
information systems have produced ethical challenges and conflicts.
The challenges and conflicts have been presented not only to the practitioner
facing new problems but also to the professional philosopher dealing with
computer use at a conceptual level.
As well, the challenges and conflicts are not only individual, often arising from
practical experience, but also collective, involving judgments regarding policy
and procedure.
These broad observations are no less true for the use of computers in
pharmaceutical research than for the use of computers generally.
4. Privacy
Before the advent and prevalence of computers, intrusions into an
individual’s privacy were largely time- and place-dependent.
The intrusion could be done but only on a small scale. As Johnson
[15] notes, however, computers have changed the nature of
intrusion into privacy as well as the scale of intrusion into
privacy.
The result is a demand to rethink privacy and rethink the
framework of applied ethics, especially because the scale of
intrusion may change the qualitative nature of the offense.
5. Privacy
Philosophers have normally thought of the right to privacy as Justice Brandeis
did over a century ago, namely, as “the right to be let alone.”
More narrowly defined, the right to privacy is thought of as the right of
individuals to determine the nature, scope, and manner of information revealed
about themselves.
The right to privacy is essentially a matter of an individual’s controlling the
information about himself or herself.
And, as has been pointed out, the supplier normally controls that which he or
she supplies.
Supplying information about an individual should be in the hands of the
individual. It takes little effort to see that computer use in pharmaceutical
research could produce violations of the right to privacy, construed broadly or
narrowly.
6. Privacy
But as philosophers have remarked and courts have ruled, the
right to privacy is not absolute.
In fact, the place and importance of the right to privacy are still
being explored, as the Supreme Court decisions in Roe v. Wade
(1973) and Planned Parenthood v. Casey (1992) demonstrate.
As a result, decisions regarding the right to privacy are very often
driven by context.
Such may be the case with situations involving computer use, the
right to privacy, and pharmaceutical research.
7. Privacy
Philosophers have identified three general aspects with regard to
the right to privacy.
The three elements involved in judging an intrusion, according to philosophers, are:
relevance,
consent, and
method
8. Privacy
Relevance
The element of relevance involves the necessity of the intrusion into privacy as
bearing a direct relationship to the matter at hand.
For instance, in employer-employee relationships, the employer may, at times,
investigate work-related problems by encroaching upon the employee’s private
life.
Such “encroachments” must be relevant to the job the employee does.
For matters relating to pharmaceutical research, the most likely problem with
regard to privacy is the possibility of learning more about an individual than
the scope of the research permits.
Generally speaking, such information must be disregarded and destroyed.
9. Privacy
Consent
Assuming that standard codes of conduct, for example, the Nuremberg Code
(1947) and the World Medical Association’s Declaration of Helsinki, are followed
by researchers, the element of consent will already have been satisfied.
In fact, as far as the element of consent to the intrusion into privacy goes, the
medical community’s doctrine of “informed consent” is a very strict application of
the element of consent.
We may note that the specific “informed consent” of an individual human subject
of research may not be adequate to the decisions surrounding data mining.
10. Privacy
Method
The third element involved with possible intrusions into the right to privacy specifies that the
method of inquiring into the private life of an individual be ordinary and reasonable.
This is an area in which technological development has had significant impact: What was
extraordinary and unreasonable in the 1970s has now become standard practice.
For instance, routine preemployment drug-testing of individuals was an unheard-of practice
thirty or forty years ago. Now, preemployment drug-testing is accepted.
The evolution of technology marks, again, an aspect of privacy that suggests a context
dependent right.
The right to privacy protects physical and psychological privacy in as much as those aspects of
privacy are “culturally recognized as private”.
The right to privacy stretches or contracts with cultural notions, and it is a simple fact that
culture changes.
11. Privacy
Method
In fact, as Ware [1] points out, the threats to the right to privacy were viewed in
the 1970s as originating primarily from the government.
The phrase “Big Brother is watching” meant that government officials had
control over information on citizens.
Now, however, a huge information industry has evolved and the biggest threats
derive from private parties.
Pharmaceutical researchers need not and ought not contribute to the supply of
information available about an individual.
Furthermore, given the advent and techniques of data mining, researchers
should take precautions and build prohibitors into research that would prevent
identification of any individual subject of the research.
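One such precaution can be sketched in code. The snippet below is a hypothetical illustration, not a prescribed standard; the field names, the salting scheme, and the age-banding rule are all assumptions. Direct identifiers are replaced with a salted one-way hash, and fields the protocol does not need never reach the stored record.

```python
import hashlib

# Hypothetical illustration: pseudonymize subject records before storage
# so research data cannot easily be traced back to an individual.
SALT = "per-study-secret-salt"  # kept separately from the research data

def pseudonymize(record: dict) -> dict:
    """Replace direct identifiers with a salted one-way hash and
    keep only the fields the research protocol actually needs."""
    token = hashlib.sha256((SALT + record["subject_id"]).encode()).hexdigest()[:12]
    return {
        "subject_token": token,
        "age_band": f"{(record['age'] // 10) * 10}s",  # coarsen age to a decade
        "result": record["result"],
    }

raw = {"subject_id": "NL-0042", "name": "J. Doe", "age": 47, "result": 3.8}
safe = pseudonymize(raw)
# 'name' and 'subject_id' never appear in the stored record.
```

The same subject always maps to the same token, so records can still be linked within the study without revealing identity.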
12. Liability
The topics related to liability and computer use in general are
legal liability,
the duty of honesty,
the nature of contracts,
misrepresentation,
express and
implied warranties, and
negligence
The relevant distinction concerns the nature of software as either a product or a
service. Many of these topics hold little interest for the ethicist investigating
computer ethics.
For instance, legal liability is less important to philosophy than to jurisprudence.
13. Liability
Duty of honesty
The duty of honesty, although generally stated, holds considerable interest
for the philosopher because of the particular manner in which it might appear
in research using human subjects.
The duty of honesty governs informed consent with regard to health risks, but
it could also serve as a springboard to inform human subjects of the potential
risks to privacy as well, even if those risks are not well understood.
The distinction between software as a product and software as a service
seems more relevant to research.
The fact that software is sold and used as a prepackaged item means that
strict liability obtains.
Should there be a defect in the software, the manufacturer is held liable.
14. However, and especially with research, software is often written with a specific
purpose in mind.
As such, the programmer is providing a service rather than a product.
In the case of software written specifically for a certain research purpose, the
liability may not fall exclusively on the software provider.
In those situations, it behooves the researcher to be very clear in knowing and
stating his or her purposes to the programmer.
In addition to the increased precision in the communication between the
researcher and the programmer, there will be an increase in the accuracy of the
data involved in the research.
One current controversy in the pharmaceutical industry, in fact, depends on
accuracy, which in turn affects liability.
15. Liability
With the increased importance of accuracy, though, comes an increase in
knowledge about an individual.
If the right to privacy demands protection, then there may need to be strict limits
on who has access to programs, especially programs involving research.
So, not only is there a need for technological “blockers” to protect against
intrusions into programs, policy and procedure must strictly limit access to
programs.
Should no clear procedure be spelled out or no clear policy implemented,
intrusions into programs and stored data largely become the liability of those
most immediately connected to the program and the institution they are part of.
16. Ownership
One of the more philosophically interesting questions surrounding computers is
the question of how to regard software.
We lack clear analogs for programs. Whereas paintings, poetry, music, and
prose bear many similarities to one another, computer software does not
consistently share such similarities.
The courts have struggled with this question as the question of applicable law
has proved difficult to answer.
So far, various devices have been used to encapsulate and resolve the
question of ownership of software.
And, of course, the question of ownership is circumscribed by the right to
property.
17. Ownership
But if the property is unlike any the world has yet seen, then it is not clear how
such property should be regarded, let alone protected.
In other words, the question of just what sort of property software is has not
been satisfactorily answered, which contributes to the debate on the
“uniqueness question.”
Nonetheless, devices such as copyrights, patents, encryption, trade secrets,
and oaths of confidentiality and standard virtues like trustworthiness and loyalty
have been tried to protect ownership and the right to property.
18. Ownership
Further complicating the matter of software as property and its place in
pharmaceutical research is consideration of the very place of property in health
care.
The value of health, most philosophers agree, is intrinsic.
It exists for itself and for no other reason.
As such, health, like life and liberty, is an important and powerful end or goal.
Ownership of property is a lesser end or goal.
19. Ownership
Conflict can occur: Do property rights protect medical breakthroughs although
great utilitarian gain might be realized by making the medical knowledge
public?
The phrasing of this question pits the right to property against utility.
However, others have cast the conflict in terms of competing notions of justice,
namely, the notions of Rawls and of Nozick.
In short, it may be that ownership of spectacularly useful medical knowledge of
the sort sometimes contained in software may have to yield to utility or to a right
to health care.
Although the matter of ownership seems to turn on an overwhelmingly broad
conceptual question, the concrete reality is that programmers who provide a
service may have some ownership rights over the research and its results.
20. Ownership
In short, not only is there a need to communicate between the researcher and
the programmer for the sake of accuracy and liability, there is a need to resolve
the issue of property rights, too.
It is worth noting that questions pertaining to liability for malfunctioning
programs also depend on the resolution of ownership.
21. Power
Johnson [17] identified the issue of power as a crucial matter for the
development of computer ethics.
Mason [23] made the same point when he identified accessibility as a concern
for people investigating computer ethics.
The issue of power may be important as never before, if Moor [10] is correct.
He has suggested that the computer revolution has now gone through two
distinct stages, namely, the introduction stage and the permeation stage.
He believes the computer revolution is now entering a third stage, the power
stage.
This stage will necessarily deal with the impact of computers on human life
especially in the areas of politics, socialization, and law.
22. Power
While Moor asks for investigation, others have already made judgments.
For instance, Joy [25] argues that limits must be placed on technology and its
development.
Others, for example, Weckert [26] do not share his rather pessimistic and
alarmist view about how technology, especially the technology of computers,
will affect human life.
That there is debate surrounding computers as they affect society and its
members is evidence that attention needs to be paid to this area.
23. Power
After stating that “power” may broadly be construed as any capacity, Johnson
[17] analyzes computer use in terms of several topics, including the matter of
centralization or decentralization of power, computer use as favoring the status
quo, the embedded values in computer use and programming, the impact on
those who have and those who do not have access to computers, the effect
computers may have on alienating people from what is rightfully theirs, and the
place of the computer professional in resolving these matters.
Of special concern to the computer user doing pharmaceutical research are the
matters of computer use as favoring the status quo and the way computers
might exclude groups or have embedded biases.
24. Codes of conduct relevant to the use of
computers
A professional code of conduct serves several purposes: to allow a profession
to regulate itself; to state the agreed-upon values of a profession; to make
members aware of issues to which they might not otherwise be sensitized; and
to provide guidelines for ethical behavior [17].
Pharmaceutical researchers have certain responsibilities and obligations in the
pursuit of their profession.
By applying computers to pharmaceutical research, researchers introduce new
ethical issues in the execution of their research.
The Association for Computing Machinery (ACM), the United States’ largest
organization of computer professionals, was aware of such potential when it
adopted its first Code of Professional Conduct in 1972.
25. Codes of conduct relevant to the use of
computers
Other organizations of computer professionals have also developed codes of
conduct to help guide the behaviors of their members.
The full versions of the current codes of conduct are available from the web
sites.
Even people who are not computer professionals themselves can use these
guidelines to help ensure that they are following ethical computing practices.
We identify several principles here that are most salient to the application of
computers to pharmaceutical research.
Pharmaceutical researchers can apply the following principles to help guide
their behavior in using computers for pharmaceutical research.
ACM principle 2.01 states that one should “provide service in their areas of
competence, being honest and forthright about any limitations of their
experience and education.”
26. Codes of conduct relevant to the use of
computers
Thus researchers who do not have the appropriate expertise in developing
computer applications should involve someone who does.
Even for those who are appropriately qualified, ACM principle 3.10 says one
should “ensure adequate testing, debugging, and review of software and
related documents on which they work.”
For example, most spreadsheet applications contain errors.
27. Codes of conduct relevant to the use of
computers
Principle 3.13, “Be careful to use only accurate data derived by ethical and
lawful means, and use it only in ways properly authorized,” is important
because computer technology makes it very easy to combine data from multiple
sources, or even to collect data in the first place.
Privacy and confidentiality are also important in data management.
Principle 3.14 instructs one to “maintain the integrity of the data, being sensitive
to outdated or flawed occurrences.”
A recent study found that pharmaceutical industry data disclosure practices are
one of the three issues most frequently reported on in a negative manner by the
press.
28. Codes of conduct relevant to the use of
computers
GlaxoSmithKline was recently sued by the state of New York for concealing the
results of clinical trials of paroxetine.
Clinicians, health care institutions, and patients making decisions about the use
of drugs or treatments can make more informed choices with access to all
relevant data.
30. INTRODUCTION
The need to improve the efficiency of the discovery and development process is
reflected most visibly by the high cost of developing new drugs and the pace at
which those costs have outstripped inflation.
Between 1987 and 2001, the cost of developing a single new drug increased
from $138 million to $802 million. Had this increase merely paced inflation, the
figure would have been $318 million (in 2002 dollars).
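The arithmetic behind these figures can be checked directly (values in millions of dollars, taken from the text above):

```python
# $138M (1987) grew to $802M (2001), versus an inflation-only
# projection of $318M (in 2002 dollars).
cost_1987, cost_2001, inflation_only = 138, 802, 318

nominal_growth = cost_2001 / cost_1987    # ~5.8x growth in nominal terms
real_growth = cost_2001 / inflation_only  # ~2.5x growth beyond inflation
```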
31. INTRODUCTION
1. These figures include direct research costs as well as those of discovery, attrition,
and cost of capital. The largest single component, the direct cost of clinical
assessment, has increased sharply because of increases in the number and size of
clinical trials.
2. Contributing factors include regulatory demands, chronic and complex indications,
difficulty in recruiting and retaining patients, and increasingly global programs. Even
more dismaying is the fact that despite considerably greater investments in research
efforts, the number of new products is shrinking.
32. INTRODUCTION
There is thus little question of the urgent need to improve the efficiency of clinical
development, defined by the time and cost of getting a new drug to market.
Recognizing that only about one of every five candidates that enter clinical
testing will eventually make it to the marketplace, the means of improved
efficiency must lie in improved decision making, primarily in the form of the
ability to kill unpromising candidates early and speed the progress of those that
are promising.
Cutting development time by half is estimated to reduce the development costs
by 30%, and improving clinical success rates to one in three would reduce costs
by 27% .
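A rough sketch, under deliberately simplified assumptions, of why the success rate matters: a sponsor effectively funds several failed programs per approval, so raising the clinical success rate from one in five to one in three cuts the number of funded programs per approval from five to three. Note the 27% figure quoted above applies to total development cost, of which clinical spend is only one component; the cost-per-candidate value here is arbitrary.

```python
# With a 1-in-5 success rate, five candidate programs are funded (on
# average) per approved drug; at 1-in-3, only three are funded.
cost_per_candidate = 100          # arbitrary units; clinical spend only
programs_per_approval_low = 5     # success rate 1/5
programs_per_approval_high = 3    # success rate 1/3

# Fraction of clinical spend per approval saved by the better hit rate:
saving = 1 - programs_per_approval_high / programs_per_approval_low  # 0.40
```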
33. INTRODUCTION
The key to improved decision making can only lie in improved data handling—earlier
decisions enabled by more data, of better quality, earlier in the process than is
currently possible.
Even a small change—killing an unpromising candidate even a few days earlier—will
substantially impact development costs.
This capability will increasingly be a key differentiator for any company developing
pharmaceutical products, from the largest multinational to the smallest biotech firm.
Large companies have dominated drug discovery in the past because of the enormous
resources required, but technology has changed this dramatically, enabling even very
small companies to compete through effective use of inexpensive tools.
34. TECHNOLOGY, PAST AND PRESENT
The basic processes that have been utilized in collecting and analyzing data over the
past few decades remain essentially unchanged.
A major bottleneck remains simply getting data into the system: The majority of data
are still recorded with a pen on a piece of paper, and then those figures are manually
entered into a computer.
Discrepancies are resolved mostly by faxing sites and manually entering corrections.
On a more strategic level, each step is completed before the next step is started: Each
query must be resolved before a database can be locked, analysis cannot start until a
database is locked, and decisions cannot be made before analysis is completed.
35. TECHNOLOGY, PAST AND PRESENT
Web-based electronic data collection (commonly called EDC systems, although this term actually
refers to any electronic system that handles data, regardless of how data are collected), which allows
some edit checks to be done in the field, is an example.
Even this tool has been hobbled by tradition: This technology still requires that data are recorded first
on a piece of paper, then transferred to a worksheet, and then manually entered by keyboard at the
site level.
The requirement for manual data entry also exacts a price: site personnel are clinicians (such as
nurses), which means they are not necessarily skilled at data entry. The result is that data entry is slow
and expensive.
In addition, because the task is onerous to many, the work simply gets delayed.
The importance of such pragmatic issues is reflected by the fact that only a fraction of clinical practices
participate in clinical trials more than once.
36. TECHNOLOGY, PAST AND PRESENT
The most important but currently absent component is a focus on decision making.
In pharmaceutical development, no activity is more important than decision making,
on both a tactical and a strategic level.
Strategic decisions (those focusing on big-picture issues such as advancing to a
higher dose, weighing safety information against efficacy data, or progressing from
one phase of development to the next) are built on a foundation of tactical decisions
(clean data and tight management of studies).
37. TACTICAL DECISION MAKING
Tactical decisions focus on two key elements, data and performance.
The data component is the traditional focus of study efficiency, as evidenced by the first
application of technology (even as ineffective as it has been) to this step.
Although it is clear that clean data are an essential element of any development
program, technology has for the most part not been applied to the effective management
of its collection.
Thus, although the current focus of technology is getting data into a database, it is
actually the performance measures that determine the speed and accuracy with which
data are collected.
38. TACTICAL DECISION MAKING
(EXAMPLE)
For example, slow response times at the site (reflected by measures such as interval
between data collection and entry and between query receipt and response) best
predict future data quality.
Systems that focus on a range of performance indices allow fundamental
management decisions about the magnitude and reasons for suboptimal
performance and allow measures to improve.
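A minimal sketch of one such performance index, the lag between data collection and data entry per site. The record layout and the three-day flagging threshold are illustrative assumptions:

```python
from datetime import date

# Each row records when a data point was collected and when it was entered.
records = [
    {"site": "S01", "collected": date(2024, 3, 1), "entered": date(2024, 3, 2)},
    {"site": "S01", "collected": date(2024, 3, 3), "entered": date(2024, 3, 10)},
    {"site": "S02", "collected": date(2024, 3, 1), "entered": date(2024, 3, 1)},
]

def mean_entry_lag_days(rows, site):
    """Average days between collection and entry for one site."""
    lags = [(r["entered"] - r["collected"]).days for r in rows if r["site"] == site]
    return sum(lags) / len(lags)

# Sites with long average lags are flagged for management attention.
slow_sites = {s for s in {r["site"] for r in records}
              if mean_entry_lag_days(records, s) > 3}
```

Here site S01 averages a four-day lag and is flagged; S02 enters data the same day it is collected.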
39. STRATEGIC DECISION MAKING
If the day-to-day management of studies can be considered tactical, the strategic use of
information is the real—and largely unrealized—promise of computers and
technology.
Assuming that the tactical side provides a reliable stream of information, strategic
decisions can then be made as data accumulate rather than waiting until a project’s
completion. Traditionally, after the last patient visit, data need to be cleaned, then
analyzed, then acted upon, usually to design and finalize the next step of research.
This process generally takes weeks to months.
40. STRATEGIC DECISION MAKING
(EXAMPLE)
One example is the rising-dose escalation study, generally conducted during phase I of a
product’s development.
These involve sequentially increasing administered dose to find the maximum tolerated dose
for subsequent testing in patients.
The industry currently conducts such studies by completing each dose level, analyzing the data
collected, and deciding about the next higher dose, a process that averages about six weeks.
However, careful planning and use of technology allow this same six-week interval to be
trimmed down to a matter of days.
A patient group is dosed in the morning, data are collected in the afternoon and evening, and
they are summarized and analyzed that evening and posted to a website.
41. STRATEGIC DECISION MAKING (EXAMPLE)
The individuals involved with making the decision about whether to proceed to the next
higher dose level are able to examine and discuss the data regardless of where they are in
the world and what time it is there.
Once a decision is made, the next dosing can be administered the following day.
During a study, this same process can be repeated for other study areas. For example, a
typical phase II study might involve several dosing arms. These studies are designed at the
outset to determine a single, sometimes two, doses that will be used in the large, costly
phase III studies that serve as the basis of application to regulatory authorities to market a
drug.
42. STRATEGIC DECISION MAKING (EXAMPLE)
In contrast, current computational technology allows a steady stream of information that
builds as the study evolves.
When the study is half completed, indications of many aspects of drug safety and
performance become obvious.
At that point, the next study can be roughed out. This plan can be refined as the study
progresses and greater knowledge is gained.
If the study is not blinded, the need for this process is self-evident; if it is blinded, then
there is still much to be gained.
43. STRATEGIC DECISION MAKING (EXAMPLE)
A wealth of safety information is generated, which is key because safety often drives
development as well as providing a general indication of drug efficacy.
For example, if there is a single treatment arm, efficacy can be indicated by
subtracting the expected performance of the comparison arm (placebo or standard
therapy) from the pooled effect estimate. The same principle can be used to guide
the extensive preparation that precedes regulatory submissions after Phase III
studies.
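The single-arm arithmetic described above reduces to a simple subtraction; the numbers here are purely hypothetical:

```python
# Blinded pooled response across both arms, and the expected comparison-arm
# response taken from historical data. Both values are assumptions for
# illustration, not from any real study.
pooled_response = 0.62    # blinded pooled response rate, arms mixed
expected_placebo = 0.35   # historical placebo response rate

# Rough indication of treatment effect while the study remains blinded:
indicated_effect = pooled_response - expected_placebo  # ~0.27
```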
44. SYSTEMS INTEGRATION
Decision making and management require:
1. a means of quickly collecting both data and performance indicators on a wide
variety of measures
2. being able to quickly and automatically summarize those data, in different forms
for different functional responsibilities, and
3. providing a means of reacting to that information.
A system designed with these principles has been in use for more than a decade.
The system, which has been continually refined and expanded during the 15 years
since it was first used, involves a series of component modules designed to work
together and that can be rapidly and flexibly customized.
45. SYSTEMS INTEGRATION
The system currently includes:
Multiple options for data input, the most important of which are machine read.
The two main options are Optical Mark Read (bubble) forms and SmartPen™, a
special pen with an optical sensor that records each pen stroke. Both utilize
paper case report forms, for which sites have indicated a strong preference
over the requirement to enter data on a keyboard.
46. SYSTEMS INTEGRATION
The pen is docked at a computer, or data can be wirelessly transmitted, and
data from anywhere in the world are immediately sent for validation.
Queries are generated within minutes, closing the feedback loop and
markedly reducing query rates as compared to conventional systems.
An automated system that immediately validates and flags areas where
human intervention is required.
Patient randomization and tracking of study subjects through an electronic
master log.
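The automated validation step listed above can be sketched as follows; the field names and acceptable ranges are purely illustrative assumptions:

```python
# Each incoming value is checked on arrival; missing or out-of-range
# values generate a query back to the site within minutes.
CHECKS = {
    "systolic_bp": lambda v: v is not None and 60 <= v <= 250,
    "visit_weight_kg": lambda v: v is not None and 20 <= v <= 300,
}

def validate(submission: dict) -> list[str]:
    """Return one query per field that fails its edit check."""
    queries = []
    for field, ok in CHECKS.items():
        if not ok(submission.get(field)):
            queries.append(f"Query: '{field}' missing or out of range "
                           f"(got {submission.get(field)!r})")
    return queries

# A submission with an implausible blood pressure triggers one query.
qs = validate({"systolic_bp": 400, "visit_weight_kg": 70})
```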
47. SYSTEMS INTEGRATION
An online query management system that allows sites to receive queries within
minutes of data submission and resolution to be handled and documented
immediately.
Links to a document management system for regulatory reports and
submissions.
A full range of reports reflecting both data and performance indices, available
over the web, that can be easily modified and supplemented.
49. 1) IMPROVING EFFICIENCY OF MONITORING
The most demanding requirement in the industry is the practice of field
monitoring, which requires highly trained individuals to visit sites to verify that
data are accurately recorded and that FDA regulations are being followed.
The traditional approach is that data collected at the site remain there until the
monitor comes by, normally every few weeks, to examine the data and bring them
back.
The monitoring process is being improved in two major ways:
First, sites are beginning to submit unmonitored data between monitor visits. This
is a major step for the industry but one that is eminently sensible in light of the
fact that computers do a far more reliable and comprehensive job of checking
data than individuals.
50. 1) IMPROVING EFFICIENCY OF MONITORING
Just as computer programmers learn that the quickest way to debug a program is
to allow debugging applications to handle the preliminary evaluations, the
industry is learning—slowly, some argue—that computers are also much better
at sorting through and highlighting areas of concern in data.
The second improvement comes from the potential that computers have, in
conjunction with newer processes, to dramatically reduce the cost and time
required to monitor data.
Most of monitoring involves comparing a source document, defined as the first
place a piece of data was recorded, with what was subsequently transcribed.
51. 1) IMPROVING EFFICIENCY OF MONITORING
The advent of newer technology such as an optical pen (SmartPen™) means
that data can, for the first time, be recorded on a form that will be the source
document.
Indeed, it is electronically recorded as it is written.
This aspect alone could substantially impact both the cost and time required
for a development program.
52. 2) IMPROVED QUALITY & TIMELINESS OF DATA
A major complaint of clinical sites is the difficulty and time required for data entry
and the effort required to resolve queries, which are often returned weeks or even
months after a patient visit.
Machine-read data, whether collected by optical mark read or SmartPen™, ensure
that data are both entered and validated, with queries returned, in a matter of
minutes after they are recorded.
Coupled with a quick feedback loop, this system ensures that query rates are
typically about one-tenth those of web-based EDC systems, and lower still
compared with paper and hand-entry systems.
The quick feedback also highlights recurring problems and areas of potential
improvement that may impair study timeliness and quality.
53. 3) SITE PERFORMANCE TRACKING
Close tracking of data entry means that performance measures can be
tracked and managed.
Such measures typically include query rates, time to respond to queries, rate
of query rejection, and the like, and they provide an opportunity to identify
and correct performance issues, whether associated with an individual site
or study or a program.
This capability alone provides a powerful tool by which sites throughout the
world can be closely monitored.
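A minimal sketch of how the measures listed above might be aggregated per site; the data layout and values are assumptions for illustration:

```python
# Per-site raw counts: queries issued, data points submitted, and
# days taken to respond to each query.
site_stats = {
    "S01": {"queries": 12, "data_points": 400, "response_days": [2, 5, 1]},
    "S02": {"queries": 40, "data_points": 500, "response_days": [9, 14]},
}

def summarize(stats):
    """Derive a query rate per 100 data points and mean response time."""
    out = {}
    for site, s in stats.items():
        out[site] = {
            "query_rate_per_100": 100 * s["queries"] / s["data_points"],
            "mean_response_days": sum(s["response_days"]) / len(s["response_days"]),
        }
    return out

summary = summarize(site_stats)
# S02's higher query rate and slower responses would prompt follow-up.
```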
54. 4) THE CHANGING ROLE OF CLINICAL RESEARCH
ASSOCIATES
The availability of a range of performance measures and the ability to do many
more routine tasks by computer (notably those related to source verification)
mean that the clinical research associate’s (CRA’s) role changes from box-checker
to manager.
The CRA, and a newer layer of submanagers who specialize in monitoring
performance data, can then focus on identifying and addressing performance
issues.
The traditional requirement to go to a site to fully understand what is occurring
is substantially reduced, and monitoring can be more effectively conducted
while spending less time traveling.
55. 5) PATIENT RECRUITMENT
The rate of recruitment is one of the most frequent factors limiting the speed of
clinical evaluations.
Newer systems, however, offer the possibility of evaluating performance on a
daily basis and the opportunity for midcourse corrections.
Screen failures and enrollment and dropout rates, all measures that directly affect
study duration and the underlying assumptions about statistical power, directly
reflect study experience and can be tracked.
The ability to link performance metrics at each step allows successful strategies
to be differentiated from those that are less successful and to have the more
successful quickly shared and the less successful reduced or eliminated.
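The recruitment measures above feed directly into simple projections. A hedged sketch, with every rate assumed purely for illustration:

```python
# Project whether enrollment will finish on time given the observed
# screening pace, screen-failure rate, and dropout rate.
target_completers = 120
screened_per_day = 4
screen_fail_rate = 0.30   # fraction of screened subjects who fail screening
dropout_rate = 0.15       # fraction of enrolled subjects who drop out

enrolled_per_day = screened_per_day * (1 - screen_fail_rate)
completers_per_day = enrolled_per_day * (1 - dropout_rate)
days_needed = target_completers / completers_per_day  # ~50.4 days
```

Recomputing this daily as actual rates come in is what makes midcourse corrections possible.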
56. 5) PATIENT RECRUITMENT
As an example, use of this strategy has consistently allowed the
establishment of industry benchmarks in enrolling studies in diverse
geographical and therapeutic areas, including enrollment in studies of
breast cancer, vaginal microbicides, and Alzheimer disease.
57. 6) MANAGEMENT OF GEOGRAPHICALLY
DIVERSE STUDIES
The confluence of communications and data processing technologies
provides the greatest synergy in the management of complex, diverse
studies, especially those that may be additionally complicated by
globalization. Properly designed, the electronic systems described here
can operate with minimal intervention, providing a steady stream of data
and performance indices, especially those systems that allow data to be
machine read and at least partial validation to be automated.