Presented at the 11th Healthcare CIO Certificate Program, School of Hospital Management, Faculty of Medicine Ramathibodi Hospital, Mahidol University, Bangkok, Thailand on February 17, 2021
3. Why Evaluate Projects?
• Promotional: To encourage more use
• Scholarly: To confirm or create scientific knowledge
• Pragmatic: To know what works and what fails
• Ethical: To ensure appropriateness & justify its use or its budget
• Medicolegal: To reduce liability risks
Friedman & Wyatt (2006)
4. Complexity of Evaluation in Informatics
[Diagram: evaluation in informatics sits at the intersection of Medicine & Health Care, Evaluation Methodology, and Information Systems]
Friedman & Wyatt (2006)
7. Class Exercise 1
• How would you evaluate the success of your project to implement Computerized Physician Order Entry (CPOE) in your hospital?
– What defines success
– Measurement methods
8. Various Ways to Measure Success
• DeLone & McLean's IS Success Model (1992; 2003) (see the sketch below)
• Revised model in 2003 adds "Service Quality"
DeLone & McLean (1992; 2003)
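One practical use of the model is as a checklist when planning which metrics to collect. The Python sketch below is a minimal illustration only: the dimension names follow the updated (2003) model, but every metric name is a hypothetical example, not a prescribed instrument.

```python
# Illustrative sketch: organizing candidate evaluation metrics by the six
# dimensions of the DeLone & McLean (2003) IS Success Model.
# All metric names below are hypothetical examples.

IS_SUCCESS_DIMENSIONS = {
    "System Quality": ["uptime %", "response time (ms)"],
    "Information Quality": ["data completeness %", "coding accuracy %"],
    "Service Quality": ["helpdesk resolution time", "support satisfaction"],
    "Use": ["CPOE adoption rate", "% orders entered electronically"],
    "User Satisfaction": ["user survey score"],
    "Net Benefits": ["medication error rate", "order turnaround time"],
}

def coverage_report(planned_metrics: set) -> dict:
    """Show which dimensions the planned metrics cover, flagging gaps."""
    return {
        dim: [m for m in metrics if m in planned_metrics]
        for dim, metrics in IS_SUCCESS_DIMENSIONS.items()
    }

if __name__ == "__main__":
    planned = {"uptime %", "CPOE adoption rate", "medication error rate"}
    for dim, covered in coverage_report(planned).items():
        print(f"{dim}: {covered or 'no metric planned'}")
```

A report like this makes it easy to see, for example, that a plan measuring only adoption and uptime has no metric for Information Quality or User Satisfaction.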
9. Health IT as Healthcare Interventions
• Donabedian's Model: Structure → Processes → Outcomes
Donabedian (1966), Friedman & Wyatt (2006)
10. Class Exercise 2
• Can you provide some examples of measures in each aspect of Donabedian's model that help evaluate health IT project success?
11. A Mindset for Evaluation
• Tailor the study to the problem
• Collect data useful for making decisions
• Look for intended and unintended effects
• Study the resource while it is under development and after it is deployed
• Study the resource in the lab and in the field
• Go beyond the developer's point of view
• Take the environment into account
• Let the key issues emerge over time
• Be methodologically catholic and eclectic
Friedman & Wyatt (2006)
12. Evaluation vs. Traditional Research
• Different goals
• Who (clients or evaluators) determines the agenda
• Evaluation actively seeks unanticipated effects as well as anticipated ones
• Both lab and in-situ evaluations are important
• Evaluations often employ many data-collection paradigms
Friedman & Wyatt (2006)
13. Evaluation Approaches
• Objectivist vs. Subjectivist approaches
• Objectivist characteristics
– Information resources, users, and processes can be measured
– Rational persons should agree on important measures and desirable outcomes
– It is possible to disprove a hypothesis, but never to fully prove one
– Quantitative measurement is superior to, and more precise than, qualitative methods
– We can assess which resource is superior through comparisons
Friedman & Wyatt (2006)
14. Evaluation Approaches
• Objectivist vs. Subjectivist approaches
• Subjectivist characteristics
– What is observed depends fundamentally on the observer
– Context is crucial
– Different perspectives on desirable outcomes can be legitimately valid
– Verbal description can be highly illuminating
– Evaluation is viewed as an exercise in argument, rather than demonstration
Friedman & Wyatt (2006)
15. Objectivist Approaches
• Comparison-Based Approach
• Objectives-Based Approach (against stated goals)
• Decision-Facilitation Approach (evaluation to resolve issues important for decision-making for further development)
• Goal-Free Approach (purposefully blinded to intended effects)
Friedman & Wyatt (2006)
16. Subjectivist Approaches
• Quasi-Legal Approach (e.g. a mock trial)
• Art Criticism Approach
• Professional Review Approach (e.g. site visit by experienced peers)
• Responsive/Illuminative Approach (derived from ethnography)
Friedman & Wyatt (2006)
17. Objectivist Studies
• Measurement studies
– "Studies undertaken to develop and refine methods for making measurements"
– E.g. development and validation of measurement methods, tools, questionnaires
• Demonstration studies
– Studies that use measurement "methods to address questions of direct importance in informatics"
– Descriptive studies (no independent variables)
– Comparative studies (investigator creates a contrasting set of conditions, as in experiments & quasi-experiments)
– Correlational studies (explore hypothesized relationships among variables that were not manipulated)
Friedman & Wyatt (2006)
18. Study Designs
• Experiments
– Randomized controlled trials (see the randomization sketch below)
• Quasi-Experiments
– Non-randomized interventions
– Investigator still controls assignment of subjects to interventions, but not through randomization
• Observational Studies
– Investigator has no control over assignment of subjects into groups
Friedman & Wyatt (2006)
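To make the randomization distinction concrete, here is a minimal Python sketch that randomly allocates units to intervention and control arms, as one might in a simple cluster-randomized trial of a health IT intervention. The ward names and seed are purely illustrative.

```python
import random

# Illustrative sketch: simple randomization of units (hypothetical hospital
# wards) to intervention vs. control arms, as in a cluster-randomized trial.
wards = ["Ward A", "Ward B", "Ward C", "Ward D", "Ward E", "Ward F"]

random.seed(42)            # fixed seed so the allocation is reproducible/auditable
random.shuffle(wards)      # random order removes investigator control of assignment
half = len(wards) // 2
intervention, control = wards[:half], wards[half:]

print("Intervention:", intervention)
print("Control:     ", control)
```

In a quasi-experiment, by contrast, the investigator would assign wards deliberately (e.g. by roll-out schedule) rather than by the shuffle above.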
23. Observational Studies
• Cohort studies
– Observe subjects with different exposures over time and compare outcomes
• Case-control studies
– Compare subjects with the outcome of interest (cases) and without it (controls) retrospectively to determine differences in exposure (see the odds-ratio sketch below)
• Cross-sectional studies
Mann (2003)
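Because case-control studies sample on outcome rather than exposure, the standard effect measure is the odds ratio. A minimal sketch with made-up counts:

```python
# Hypothetical 2x2 table for a case-control study (all counts made up):
#                 cases   controls
# exposed           a        b
# unexposed         c        d
a, b, c, d = 30, 20, 70, 80

# Odds of exposure among cases (a/c) divided by odds among controls (b/d),
# which simplifies to the cross-product ratio ad/bc.
odds_ratio = (a * d) / (b * c)
print(f"Odds ratio: {odds_ratio:.2f}")  # 1.71: exposure more common among cases
```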
28. Threats to Internal Validity: Biases
• Assessment bias
• Allocation and recruitment bias
• The Hawthorne Effect (the tendency for humans to improve their performance if they know it is being studied)
• Data collection biases
– Checklist effect
– Data completeness effect (more complete data in intervention cases than controls)
– Feedback effect
– Carryover effect (spillover effect)
– Placebo effect
– Second-look bias
Friedman & Wyatt (2006)
31. Threats to External Validity
• Study generalizability
– Sample representativeness
– Intervention (including implementation strategies)
– Context
• Developers as evaluators
Friedman & Wyatt (2006)
32. Making Conclusions
• Internal and external validity
• Correlation vs. causation
• Acknowledgement of study limitations
• Anticipated vs. unanticipated effects
• Lessons learned
33. Special Study Methods Used in Informatics
• Surveys
– Study design: Cross-sectional vs. longitudinal
– Subjects
– Sampling methods
• Census
• Random sampling (simple, stratified, cluster) (see the sampling sketch below)
• Nonprobability sampling (purposive sampling, quota sampling, etc.)
– Sampling frame
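As an illustration of stratified random sampling, the sketch below draws a fixed fraction from each stratum of a sampling frame. The strata, names, and sizes are all hypothetical.

```python
import random

# Illustrative sketch: stratified random sampling of survey respondents,
# drawing 20% within each stratum (here, professional role).
# The sampling frame below is entirely made up.
sampling_frame = {
    "physicians":  [f"MD-{i}" for i in range(50)],
    "nurses":      [f"RN-{i}" for i in range(200)],
    "pharmacists": [f"Rx-{i}" for i in range(30)],
}

random.seed(1)  # fixed seed for a reproducible sample
sample = {
    stratum: random.sample(members, k=max(1, len(members) // 5))
    for stratum, members in sampling_frame.items()
}
for stratum, members in sample.items():
    print(stratum, len(members), "sampled")
```

Sampling within each stratum guarantees that small groups (here, pharmacists) are represented, which a simple random sample of the whole frame might miss.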
37. Special Study Methods Used in Informatics
• Time and Motion Studies (Time-Motion Studies)
• Economic Analysis
– Cost-effectiveness analysis (see the ICER sketch below)
– Cost-benefit analysis
– Cost-utility analysis
– Economic impact analysis
– Return on investment analysis
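For instance, a cost-effectiveness analysis commonly reports an incremental cost-effectiveness ratio (ICER): the extra cost of the new intervention divided by its extra effect. A minimal sketch with entirely made-up figures:

```python
# Illustrative sketch: incremental cost-effectiveness ratio (ICER) comparing
# a new health IT intervention to usual care. All numbers are made up.
cost_new, effect_new = 1_200_000, 450   # cost (USD), effect (errors prevented)
cost_old, effect_old = 800_000, 300

# ICER = (incremental cost) / (incremental effect)
icer = (cost_new - cost_old) / (effect_new - effect_old)
print(f"ICER: ${icer:,.0f} per additional error prevented")  # $2,667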
38. Special Study Methods Used in Informatics
• Qualitative Studies
– Interviews
– Focus groups
– Usability evaluations
– Content analysis
39. Special Study Methods Used in Informatics
• Software Testing & Evaluation Methodology
• Testing Levels
– Unit testing (see the sketch below)
– Integration testing
– System testing
– System integration testing
http://en.wikipedia.org/wiki/Software_testing
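As a concrete example of the lowest level, a unit test exercises a single function in isolation. A minimal sketch using Python's built-in unittest module; the dose-range checker is a hypothetical CPOE helper, not code from any real system:

```python
import unittest

def is_dose_in_range(dose_mg: float, min_mg: float, max_mg: float) -> bool:
    """Hypothetical CPOE helper: check a prescribed dose against safe limits."""
    return min_mg <= dose_mg <= max_mg

class TestDoseCheck(unittest.TestCase):
    def test_dose_within_range(self):
        self.assertTrue(is_dose_in_range(500, 250, 1000))

    def test_dose_above_range(self):
        self.assertFalse(is_dose_in_range(1500, 250, 1000))

if __name__ == "__main__":
    unittest.main()
```

Integration and system testing would then verify this check in combination with the order-entry workflow and the full system, respectively.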
42. Project Evaluation as Part of Project's KM
[Diagram: Nonaka SECI Model, with knowledge-conversion quadrants annotated by project phase — before & after project kick-off and during project planning; during implementation and near go-live (training); during implementation, near go-live & post go-live; and After Action Review (AAR) / postmortem meeting, project evaluation]
Image source: Senoo et al. (2007)
43. "Half the money I spend on advertising is wasted; the trouble is I don't know which half." -- John Wanamaker
http://www.quotationspage.com/quote/1992.html, http://en.wikipedia.org/wiki/John_Wanamaker
44. References
• DeLone WH, McLean ER. Information systems success: the quest for the dependent variable. Inform Syst Res. 1992 Mar;3(1):60-95.
• DeLone WH, McLean ER. The DeLone and McLean model of information systems success: a ten-year update. J Manage Inform Syst. 2003 Spring;19(4):9-30.
• Donabedian A. Evaluating the quality of medical care. Milbank Mem Fund Q. 1966;44:166-206.
• Friedman CP, Wyatt JC. Evaluation methods in biomedical informatics. 2nd ed. New York (NY): Springer; 2006. 386 p.
• Harris AD, McGregor JC, Perencevich EN, Furuno JP, Zhu J, Peterson DE, Finkelstein J. The use and interpretation of quasi-experimental studies in medical informatics. J Am Med Inform Assoc. 2006 Jan-Feb;13(1):16-23.
45. References
• Mann CJ. Observational research methods. Research design II: cohort, cross sectional, and case-control studies. Emerg Med J. 2003;20:54-60.
• Office of Management and Budget, Office of Information and Regulatory Affairs, Statistical Policy Office. Statistical policy working paper 31: Measuring and reporting sources of error in surveys. 2001 Jul.