Health
Informatics
BCA-2020: Semester-V
Module 4:
Chapter 1
DESIGN AND EVALUATION
OF INFORMATION
SYSTEMS AND SERVICES
QUALITY IMPROVEMENT
STRATEGIES
PROTOCOLS AND
EVIDENCE BASED
HEALTHCARE
Module
Content
 Design and Evaluation of Information Systems and Services:
principles of designing information systems, strategies for
Information system evaluation, Information Systems
Effectiveness Measures.
 Quality Improvement Strategies: quality improvement tools,
factors that help to create and sustain Healthcare Informatics as a
new field; quality improvement cycle: PDCA (Plan, Do, Check, Act)
Cycle.
 Protocols and Evidence based Healthcare: information
technology tools to support best practices in health care,
information technology tools that inform and empower patients.
Learning Objectives
 Principles of designing information systems
 Strategies for Information system evaluation
 Information Systems Effectiveness Measures
SYSTEM
DESIGN
 System design includes the organization of people, equipment, money and
procedures to process the information.
 System analysis and design draw heavily on the General Systems Theory as a
conceptual background.
 Given below are the general system theories and the importance of each one
in the context of information system design.
SYSTEM
DESIGN
Principles of a
Well-designed
System
 1) Principle of acceptability
 The success of a new system is highly dependent upon its acceptability by organizational
personnel or the persons for whom it is designed. For a successful system, the people who
use it should participate in its analysis, design and development.
 2) Principle of enhancing the decision-making process
 The new system should enhance the decision-making ability of organizational personnel.
This design approach allows more effective decisions.
 3) Principle of economy
 For economy in the new system, no information service should be provided that cannot be
cost-justified.
 4) Principle of flexibility
 The new system should be adaptable in a changing environment by allowing easy
expansion or contraction.
 5) Principle of reliability
 Reliability in a new system refers to consistency over an extended period of operation. A
high degree of reliability can be designed into the system by including good internal
controls.
 6) Principle of simplicity
 The simplicity of a system can be achieved by providing a straight-line flow from one step to
the next, avoiding needless backtracking. Additionally, a simplified system is easier to
understand and use than a more complex system.
Steps in
System Design
Process
 The key steps in the systems design process are:
 a) Review new system requirements
 Systems design, devising new system approaches, centers on
determining the requirements for a new system. This initial step in
systems design management takes into account the information
compiled to date on the present system. After system analysts have
reviewed the appropriate data, they must specify the following:
 — New policies consistent with the organization's objectives
 — Planned inputs
 — New methods and procedures
 — Data files to be maintained
 — Output needs
 — Internal control considerations
 — Equipment considerations
 The foregoing requirements for newly designed systems are not
complete until the human factors are considered.
Steps in
System Design
Process
 b) Design the new system
 It is recommended that a methodical approach to systems
design be undertaken initially during this critical phase.
 Recommended is the modular or building-block approach,
wherein major system functions are successively separated
into distinct minor functions.
 When the functional analysis is complete, the systems
analyst creates a structure for the functional modules that is
capable of operating within whatever hardware constraints
are imposed.
 The net result of the modular approach is that duplicated
activities are eliminated and the complexity of the overall
systems is reduced.
Steps in
System Design
Process
 c) Flowchart and document the new system
 An important step is preparation of the final system flowcharts for
the recommended system, without specifying the equipment.
Accuracy, simplicity and ease of understanding are the essential
components since non-technical personnel may be reviewing and
evaluating them.
 d) Consider and review system design alternatives with proper personnel
 e) Select the more promising alternatives with the help of properly
experienced personnel
Steps in
System Design
Process
 f) Compare the tangible and intangible benefits of the promising alternatives.
Cost factors, volumes and requirements for equipment and personnel should
be carefully analysed to check their validity
 g) Select the system design from among the promising alternatives that best
meets the study’s requirements
 h) Prepare the final system specifications for the recommended systems
design. Relate the systems design to other appropriate parts of the
information system
 i) Document the final design.
SYSTEM
EVALUATION
 After successful implementation, it is necessary to complete the
exercise by going back to the specifications and assessing the extent
to which the system is meeting its stated objectives.
 Such an assessment may lead to improvements and refinements in
the way in which a system is used.
 The evaluation process also includes testing of the equipment, as the
systems analyst, while selecting equipment, must be aware of
modularity, compatibility, reliability, maintainability and vendor
support.
 Here, vendor support refers to availability of training facilities;
installation support; system development, conversion and testing
assistance; experience level and competence; availability of a user
group and availability of specialized software systems.
 Evaluation helps to assess whether the operational characteristics of
the sub-systems have been made compatible with interrelated
subsystems and with the overall system. While evaluating, the
effectiveness of the system can be measured only after the following
have been accomplished:
 Identification of user requirements at all levels of system design
 Measurement of system and subsystem performance at all levels of
operation.
Strategies for
Information
system
evaluation
 Evaluation of Information Systems
 Evaluation of information systems means measuring their efficiency.
It depends on the degree of efficiency in workflow and management
that an organization achieves by implementing an information
system. Evaluation of information systems is based on the following
criteria (a simple scoring sketch follows the list):
 Dependency of Information
 Availability of Information
 Support in Business Operation
 Help to Survive in Competition
 Reduce Uncertainty
 Value and Cost of Information
 Presentation of Information
 Accuracy of Information
 Verification and Validation of Information
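 A minimal sketch of how these criteria might be combined into an overall score is
given below. The 1-to-5 ratings, the weights and the criterion labels are illustrative
assumptions chosen for this example, not a prescribed method.

    # Hypothetical weighted scorecard for evaluating an information system.
    # Ratings are on a 1-5 scale; weights reflect how important each criterion
    # is to the organization and are assumed to sum to 1.0.
    criteria = {
        # criterion:                    (weight, rating)
        "Dependability of information":  (0.15, 4),
        "Availability of information":   (0.15, 5),
        "Support in business operation": (0.10, 4),
        "Help to survive competition":   (0.10, 3),
        "Reduction of uncertainty":      (0.10, 3),
        "Value vs. cost of information": (0.15, 4),
        "Presentation of information":   (0.05, 5),
        "Accuracy of information":       (0.10, 4),
        "Verification and validation":   (0.10, 3),
    }

    total_weight = sum(w for w, _ in criteria.values())
    assert abs(total_weight - 1.0) < 1e-9, "weights must sum to 1"

    overall = sum(weight * rating for weight, rating in criteria.values())
    print(f"Overall evaluation score: {overall:.2f} out of 5")

 An organization would normally agree on the weights with its stakeholders before
scoring, so that the result reflects its own priorities rather than the evaluator's.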
Strategies for
Information
system
evaluation
 Dependency of Information
 How much an organization can rely on the information it gets from
its information system is a key criterion when evaluating the system:
if the information system does not provide reliable information, there
is no point in having it in the organization.
 Availability of Information
 Information systems are meant to deliver information when asked for. So if an
information system cannot deliver information in time, that is, if information is
unavailable at the moment it is required, the whole system becomes
useless.
 Support in Business Operation
 An organization implements one or more information systems to get
support at every level of its operations. Thus an efficient
information system is expected to support business operations at each level.
 Help to Survive in Competition
 To survive and run successfully in the market, an organization needs to face and
win various challenges from other organizations in the market. Information
systems are expected to help the organization in this survival by delivering
competitive reports.
Strategies for
Information
system
evaluation
 Reduce Uncertainty
 Information systems are supposed to deliver forecasts based data and current
market data to take business decision in a way that reduces uncertainty of the
business.
 Value and Cost of Information
 The cost of developing, implementing and running an information system must be
less than the value of the information it provides. If it costs more to operate an
information system than the value it gives in return, there is no need to have such
an information system (a small worked example follows below).
 Presentation of Information
 Efficiency of an information system depends on the presentation of information by
the system. An information system that presents information in an attractive,
easy-to-understand format is obviously more efficient than others.
 Accuracy of Information
 Like the reliability of the information obtained from an information system, the
accuracy of the information is also important.
 Verification andValidation of Information
 If data were collected from multiple sources, then the information system must cross-
check the data to verify and validate it before using it in processing to generate
information.
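 As a small worked example of the value-versus-cost criterion above, the yearly cost
of owning a system can be compared with the estimated value of the information it
delivers. All figures below are invented for illustration.

    # Hypothetical yearly figures, all in the same currency unit.
    development_cost_per_year = 200_000   # development cost spread over the system's life
    operating_cost_per_year   = 150_000   # running, maintenance and support
    value_of_information      = 420_000   # estimated yearly benefit of the information

    total_cost = development_cost_per_year + operating_cost_per_year
    net_value  = value_of_information - total_cost

    if net_value > 0:
        print(f"Information is worth having: net value {net_value} per year")
    else:
        print(f"System costs more than the information is worth: {net_value} per year")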
Information
Systems
Effectiveness
Measures
 As Information Systems (IS) continue to become more and more complex
due to greater adoption and more sophisticated planned usage, the
concerns surrounding the effectiveness of such software projects
become more and more critical.
 Further, for IT/ITeS service providers it becomes even more critical with
the growing understanding that merely implementing progressive and
ambitious software projects within organizations may not necessarily
enhance organizational productivity and lead to a competitive
advantage that cannot be overtaken by competitors.
 This is where a deeper understanding of the critical measures of
Information Systems Effectiveness comes in, so that the concerns may
be addressed at the grassroots level.
 Some of the key indicators of information system ineffectiveness
(a small monitoring sketch follows this list):
 Excessive down time and idle time of implemented software projects
 Slow response time of project deliverables
 Excessive maintenance costs of systems
 Unreliable outputs across systems and duplication of data
 Erroneous data across the different functional departments
 Excessive run costs during system uptime
 User dissatisfaction with output, content or timeliness
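 Two of these indicators, excessive down time and slow response time, can be checked
mechanically against agreed service levels. The monitoring figures and thresholds in
the sketch below are assumptions for illustration only.

    # Hypothetical monthly monitoring figures for an implemented system.
    hours_in_month       = 30 * 24
    downtime_hours       = 36          # hours the system was unavailable
    avg_response_seconds = 4.2         # average response time of key transactions

    availability = 100 * (hours_in_month - downtime_hours) / hours_in_month

    # Assumed service levels; each organization sets its own.
    MIN_AVAILABILITY_PCT = 99.0
    MAX_RESPONSE_SECONDS = 3.0

    print(f"Availability this month: {availability:.2f}%")
    if availability < MIN_AVAILABILITY_PCT:
        print("Indicator flagged: excessive down time")
    if avg_response_seconds > MAX_RESPONSE_SECONDS:
        print("Indicator flagged: slow response time")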
Information
Systems
Effectiveness
Measures
 The measures or criteria for Information Systems Effectiveness can be
broken down hierarchically, as highlighted in the accompanying diagram, into
relative evaluation (task accomplishment and quality of working life) and
absolute evaluation (operational, technical and economic effectiveness), each
addressing a specific dimension of utility from IS project implementations.
Information
Systems
Effectiveness
Measures
 RELATIVE EVALUATION
 Task Accomplishment: An effective IS project is expected to
improve the task accomplishment of the system users.
 However it is important to note that providing specific measures
of past accomplishment that auditors can use for evaluation can
be difficult.
 Performance measures for task accomplishment differ across
applications and sometimes across organizations.
 For example, measures for a manufacturing control system might be the
number of units output, the number of defective units reworked, units
scrapped, and the amount of down time/idle time. Similarly, it is important
to trace task accomplishment over time. A system may appear to
have improved for a short time after implementation, but fall into
disarray thereafter.
Information
Systems
Effectiveness
Measures
 RELATIVE EVALUATION
 Quality of Working Life: High quality of working life for users of a
system is a major objective in the design process.
 For example, virtual offices ensure a better work-life balance.
However, there are variations in the definition and measurement of
the concept of quality of working life.
 Different groups may have different vested interests: e.g.
productivity versus social interests, such as how to evade work deliverables
and engage in social behavior.
 Some of the major advantages of this measure are that it is
relatively objective, verifiable, and difficult to manipulate.
 The data required are relatively easy to obtain. However, some of the
major disadvantages of this measure are that it is difficult to link
to IS quality and difficult to pinpoint what corrective action
is needed.
Information
Systems
Effectiveness
Measures
 ABSOLUTE EVALUATION
 Operational Effectiveness: The Systems Auditor examines how well
an Information system meets its goals from the viewpoint of a user
who interacts with the system on a regular basis.
 Four measures are used for estimating operational effectiveness (a simple
survey-scoring sketch follows this slide):
 Frequency of use,
 Nature of use,
 Ease of use and
 User satisfaction.
 If a system is being frequently used, it is likely to be more effective
based on the perspective of its end user.
 Similarly a very sophisticated system may have analytics and insights
dashboards, but it may only be used to record transactions.
 This lower nature of usage by the end user would indicate lower
effectiveness of the implemented information systems.
 Ease of use and user satisfaction after usage are evaluated from the
perspective of the end user. This gives the auditor additional
insight into whether there is any productivity loss when the systems are
complex and difficult to handle from the user-interface perspective.
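 A minimal sketch of how an auditor might summarize these four measures from an
end-user survey is given below. The rating scale, the sample responses and the
unweighted index are assumptions, not a prescribed audit procedure; frequency and
nature of use could equally be taken from system logs.

    # Hypothetical survey responses, each measure rated 1 (poor) to 5 (good).
    responses = [
        {"frequency": 5, "nature": 3, "ease": 4, "satisfaction": 4},
        {"frequency": 4, "nature": 2, "ease": 3, "satisfaction": 3},
        {"frequency": 5, "nature": 4, "ease": 4, "satisfaction": 5},
    ]

    measures = ["frequency", "nature", "ease", "satisfaction"]
    averages = {m: sum(r[m] for r in responses) / len(responses) for m in measures}

    for m in measures:
        print(f"Average {m:12s}: {averages[m]:.2f} / 5")

    # Simple unweighted index; an auditor may weight the four measures differently.
    index = sum(averages.values()) / len(averages)
    print(f"Operational effectiveness index: {index:.2f} / 5")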
Information
Systems
Effectiveness
Measures
 ABSOLUTE EVALUATION
 Technical Effectiveness: In this evaluation, the focus of the auditor is
on whether appropriate hardware/software technology has been used to
support the implemented Information System, or whether a change in
the technology would enable the system to meet its goals better.
 Hardware performance can be measured using hardware monitors or
coarser measures such as system response time and down time.
 Software effectiveness can be measured by examining the history of
program maintenance, modification and run time resource
consumption.
 The history of software or hardware repair and maintenance indicates
the quality of the logic in a program. Frequent and extensive error
correction implies inappropriate design, coding or testing, failure to use
structured approaches, etc. (a small log-analysis sketch follows this slide).
 However there is a major challenge in using these measures. One
needs to understand that hardware & software are not independent
resources and the effectiveness audit needs to account for possible
overlaps.
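 A minimal sketch of examining a program's maintenance history is given below. The
change log and the classification of changes are hypothetical; a high share of
frequent corrective changes would point to weaknesses in design, coding or testing.

    from collections import Counter

    # Hypothetical maintenance log: (year-month, type of change).
    # "corrective" = error correction; "adaptive"/"perfective" = other maintenance.
    maintenance_log = [
        ("2023-01", "corrective"), ("2023-01", "corrective"), ("2023-01", "adaptive"),
        ("2023-02", "corrective"), ("2023-02", "perfective"),
        ("2023-03", "corrective"), ("2023-03", "corrective"), ("2023-03", "corrective"),
    ]

    corrective_per_month = Counter(month for month, kind in maintenance_log
                                   if kind == "corrective")

    total_changes = len(maintenance_log)
    corrective_changes = sum(corrective_per_month.values())
    print(f"Corrective changes: {corrective_changes} of {total_changes} "
          f"({100 * corrective_changes / total_changes:.0f}% of all maintenance)")
    for month, count in sorted(corrective_per_month.items()):
        print(f"  {month}: {count} error corrections")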
Information
Systems
Effectiveness
Measures
 ABSOLUTE EVALUATION
 Economic Effectiveness: This requires the identification of costs and
benefits and their proper evaluation (a small worked example follows this slide).
 Sometimes, achieving it is difficult since costs and benefits
depend on the nature of the Information Systems Project.
 Not all costs and benefits can be tracked directly in many business
functions. Some of the benefits expected and derived from an IS
would be based on the contextual usage.
 For example, a system designed to support a social service
environment vis-à-vis a system designed to support
manufacturing activities would have different outcomes.
 Some of the most significant costs and benefits may be intangible
and difficult to identify, and next to impossible to value. So an
auditor needs to be careful while attempting measures to evaluate
a system based on these criteria.
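 Where tangible figures are available, a simple benefit-cost comparison over the
expected life of the system can support this evaluation. The cash flows and the
discount rate in the sketch below are invented assumptions; intangible costs and
benefits still require the auditor's judgement.

    # Hypothetical cash flows over a 4-year system life (year 0 = implementation).
    costs    = [500_000, 120_000, 120_000, 130_000]   # build, then yearly running costs
    benefits = [      0, 280_000, 300_000, 320_000]   # tangible benefits only
    discount_rate = 0.08                              # assumed cost of capital

    def present_value(amounts, rate):
        """Discount a list of yearly amounts back to year 0."""
        return sum(a / (1 + rate) ** year for year, a in enumerate(amounts))

    pv_costs    = present_value(costs, discount_rate)
    pv_benefits = present_value(benefits, discount_rate)

    print(f"Net present value:  {pv_benefits - pv_costs:,.0f}")
    print(f"Benefit-cost ratio: {pv_benefits / pv_costs:.2f}")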
Information
Systems
Effectiveness
Measures
 It is important to note that beyond these effectiveness measures, there are
other dimensions that auditors may need to account for while conducting an
information systems audit. Efficiency audits are equally important for
information systems projects, and a detailed understanding may be required
while conducting them thoroughly.
Thank You!
