Having systematic questionnaire design and testing procedures in place is vital for data quality, particularly for minimising measurement error.
2. Contents
December 12, 2014
Dr. AhmedRefat *** WWW.Slideshare.net/AhmedRefat
2
Main Concepts & Terms (Pre-test)
Sequence of questionnaire development & testing
Questionnaire Testing Methods
Error Detection
Questionnaire Evaluation Criteria
Criteria for a Good Questionnaire
Using the Questionnaire Appraisal System (QAS-99)
3. Main Concepts & Terms (Pre-test: 10 terms)
Define the underlined terms in the following slide.
4. It has to be established that the questionnaire is valid and reproducible in the context in which it is going to be employed:
The validity of the questionnaire should have been investigated;
The items should be a correct and comprehensive reflection of the concept the questionnaire is intended to measure (content validity);
The dimensionality (factor structure) and internal consistency of the questionnaire should have been investigated;
The test-retest reliability of the questionnaire should have been investigated;
Responsiveness needs to be determined for questionnaires that will be used to measure changes within individuals;
Information regarding the interpretation of questionnaire scores should be available;
Floor effects or ceiling effects should be less than 15%;
Accepted score: 7+.
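The floor/ceiling criterion above is directly computable: count the share of respondents at the lowest and highest possible scores. A minimal sketch (the scores and the 0-10 scale range are hypothetical; the 15% threshold is the one quoted in the checklist):

```python
# Sketch: checking floor and ceiling effects for a questionnaire scale.
# The data and scale range below are hypothetical pilot results.

def floor_ceiling_effects(scores, min_score, max_score):
    """Percentage of respondents at the lowest (floor) and highest
    (ceiling) possible total score."""
    n = len(scores)
    floor = 100.0 * sum(1 for s in scores if s == min_score) / n
    ceiling = 100.0 * sum(1 for s in scores if s == max_score) / n
    return floor, ceiling

# Hypothetical example: a 0-10 scale answered by 20 respondents.
scores = [0, 0, 1, 2, 3, 3, 4, 5, 5, 6, 6, 7, 7, 8, 8, 9, 9, 10, 10, 10]
floor, ceiling = floor_ceiling_effects(scores, 0, 10)
print(floor, ceiling)  # 10.0 15.0
```

Scores concentrated at either end (above the 15% cut-off) would signal that the scale cannot discriminate among respondents at that extreme.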
7. Recommended Source-3: Questionnaire Appraisal System (QAS-99)
QAS-99 is based on a system developed for the Behavioral Surveillance Branch of the Centers for Disease Control and Prevention, for use in evaluating questions for the Behavioral Risk Factor Surveillance System (BRFSS).
http://www.cdc.gov/nccdphp/brfss
8. Questionnaire development approach
I. Determine Analytic Objectives
• What types of data will answer the research question?
II. Develop general concepts to be covered
• List areas to be covered by questions
III. Translate concepts into questions
IV. “Appraise” questions for common pitfalls
V. Evaluate questions empirically
10. Sequence of questionnaire development & testing
11. 1- Conceptualization
The conceptual frame of the questionnaire is more important for new surveys, whereas in existing surveys the concepts may already be well established.
Conceptualization is an integral part of every change to a questionnaire.
The main outputs are:
An entities/relationships scheme,
An area tree describing the structure of the targeted questionnaire,
A list of target variables
12. 2- Questionnaire design.
The first draft of the questionnaire addresses:
Appropriate wording,
Order of questions and
Definition of answering categories
13. 3- Testing.
The questionnaire needs to be tested regarding:
Wording of questions/answers, order and structure of the questionnaire;
Problems related to translation and cultural background; and
Data collection mode and the involvement of an interviewer (validity & reliability).
14. Testing the Questionnaire-Cont’
Pre-field and field methods.
It is recommended to involve respondents.
A combination of different methods is advisable
15. 4- Revision.
After testing, revisions are made to the questionnaire.
Afterwards, a new round of testing is often essential.
This process may be repeated through two, three or even more phases of testing.
The aim is to check whether the changes really result in higher validity and reliability of the data in relation to the specific objectives of the survey.
16. 5- Data collection
With the implementation of the survey (either as a pilot study or as the real survey), the process of development and revision ends,
but observation should continue via monitoring of the interviewers.
17. 5-Data collection-cont’
Monitoring can be regarded as a continuous tool of evaluation.
The monitoring of fieldwork can be essential for the later phase of post-survey evaluation, whether conducting ongoing surveys or having implemented a new survey on full scale.
18. Questionnaire Testing Methods
19. Questionnaire Testing Methods

Method 1: Focus groups (FGs) (respondent group discussion)
Phase of testing: early stage of questionnaire design
Aims: to gain a reflection of the target population's perspective; to check terms

Method 2: Informal test (evaluation by colleagues)
Phase of testing: possible in each phase, preferably at the beginning
Aims: to detect all kinds of mistakes: wording, layout, skips, etc.

Method 3: Expert group (group discussion among design and subject-matter experts, sometimes with users)
Phase of testing: initial phase of questionnaire development
Aims: to check concepts, definitions and vocabulary against the survey's objectives; to discuss data processing requirements
Expert groups focus specifically on: terms and wording of questions; structure of questions; the response alternatives; order of questions; navigational rules; instructions to interviewers; confusing layout; typographical errors

Method 4: In-depth or qualitative interviews (interviews with respondents)
Phase of testing: early stage of developing and testing the questionnaire
Aims: to evaluate respondents' viewpoint and understanding of the questionnaire; explorative in nature

Method 5: Cognitive interviews (one-to-one, in-depth, structured interviews with specially trained interviewers and researchers)
Phase of testing: middle of the development process, once a draft questionnaire has been developed
Aims: to gain qualitative information on how a questionnaire is understood and answered

Method 6: Observational interviews (observation of respondents while completing a questionnaire)
Phase of testing: middle of development, when a tested questionnaire exists
Aims: to check a self-completed questionnaire by observing potential respondents in the lab

Method 7: Behaviour coding (coding the behaviour and interaction of interviewers and respondents)
Phase of testing: after a set of pre-field methods have been conducted
Aims: to evaluate the question-answering process with standardized methods and a coding scheme
41. Error Detection
Completely identifying the errors is virtually impossible;
therefore, the goal should be to design a testing procedure capable of catching as many errors as possible.
42. Error Detection
Q-by-Q: analysis of single questions.
Testing by task: assignment of specific tasks to different testers, so that each of them focuses on given issues.
Scenario testing: hypothesizing some real situations, entering them in the questionnaire, and checking the performance and the results.
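For an electronic questionnaire, scenario testing can be sketched in code: hypothetical answer scenarios are fed through the instrument's routing rules and the resulting question sequence is checked. The questions and skip rules below are invented for illustration; real testing would enter each scenario into the actual instrument:

```python
# Sketch: scenario testing of a questionnaire's skip (navigational) rules.
# The questions and routing below are hypothetical.

def route(answers):
    """Return the sequence of questions asked for a given answer scenario."""
    asked = ["Q1_employed"]
    if answers["Q1_employed"] == "yes":
        asked.append("Q2_occupation")      # only employed respondents
        asked.append("Q3_hours_per_week")
    asked.append("Q4_general_health")      # asked of everyone
    return asked

# Scenario 1: an employed respondent should get all four questions.
assert route({"Q1_employed": "yes"}) == [
    "Q1_employed", "Q2_occupation", "Q3_hours_per_week", "Q4_general_health"]

# Scenario 2: an unemployed respondent should skip the work questions.
assert route({"Q1_employed": "no"}) == ["Q1_employed", "Q4_general_health"]
```

Each scenario corresponds to one hypothesized "real situation"; a failed assertion flags a broken navigational rule before fieldwork begins.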
43. Questionnaire Evaluation Criteria
44. Questionnaire Evaluation Criteria
The best-known and most comprehensive criteria are those from the Scientific Advisory Committee (SAC) of the Medical Outcomes Trust. The SAC defined eight attributes that deserve consideration in evaluation.
45. Questionnaire Evaluation Criteria (SAC)
The eight attributes of the good questionnaire?
46. Questionnaire Evaluation Criteria
(1) Conceptual and measurement model,
(2) Validity, (……,……,……)
(3) Reliability,
(4) Responsiveness,
(5) Interpretability,
(6) Respondent and administrative burden,
(7) Alternative forms, and
(8) Cultural and language adaptations (translations).
47. Specific Criteria
Within each attribute, specific criteria were defined:
(1) content validity,
(2) construct validity,
(3) criterion validity,
(4) internal consistency,
(5) reproducibility,
(6) responsiveness,
(7) floor and ceiling effects, and
(8) interpretability.
48. Validity & Reliability
49. Validity
Validity addresses the amount of systematic or "built-in" error contained in the measure.
50. Reliability
Reliability refers to random error in measurement.
Reliability indicates the accuracy or precision of the measuring instrument.
The pilot test seeks to answer the question:
Does the questionnaire consistently measure whatever it measures?
51. Reliability
Several methods exist; all involve administering the instrument to a small sample during a pilot test.
A common procedure is the test/re-test:
The instrument is given to the same group of individuals twice (about one week apart) and the two sets of scores are correlated, resulting in a coefficient of stability.
A correlation above 0.7 would indicate acceptable reliability.
Other techniques include Cronbach's Alpha.
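The test/re-test procedure amounts to computing Pearson's correlation between the two administrations. A minimal sketch with hypothetical pilot scores (in practice a statistics package would be used):

```python
# Sketch: test/re-test coefficient of stability via Pearson's r.
# The two score lists are hypothetical pilot data: the same eight
# respondents, measured about one week apart.
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

test1 = [12, 15, 9, 20, 17, 11, 14, 18]   # first administration
test2 = [13, 14, 10, 19, 18, 10, 15, 17]  # one week later
r = pearson_r(test1, test2)
print(round(r, 3))  # ≈ 0.96, well above the 0.7 threshold
```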
52. Specific Criteria
Within each attribute, specific criteria were defined:
(1) content validity,
(2) construct validity,
(3) criterion validity,
(4) internal consistency,
(5) reproducibility,
(6) responsiveness,
(7) floor and ceiling effects, and
(8) interpretability.
53. 1-Content Validity
Content validity examines the extent to which the concepts of interest are comprehensively represented by the items in the questionnaire.
54. 1- Content Validity-cont;
December 12, 2014
Dr. AhmedRefat *** WWW.Slideshare.net/AhmedRefat
54
To evaluate content validity, the following aspects should be considered:
1. Measurement aim of the questionnaire
2. Target population
3. Concepts that the questionnaire intends to measure
4. Item selection and item reduction
5. Interpretability of the items
Completing the questionnaire should not require reading skills beyond those of a 12-year-old, to avoid missing values and unreliable answers.
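As a rough automated check on this reading-skill requirement, a Flesch-Kincaid grade level can be estimated. The syllable counter below is a crude vowel-group heuristic, so treat the result as an approximation only:

```python
# Approximate Flesch-Kincaid grade level for a draft question.
# The syllable count is a heuristic (runs of vowels), not a dictionary lookup.
import re

def count_syllables(word):
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fk_grade(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

question = "How many hours of sleep did you get last night?"
grade = fk_grade(question)
print(f"approximate grade level: {grade:.1f}")
```

A grade level at or below roughly 6-7 corresponds to the reading skills of a 12-year-old; dedicated tools (such as the readability checker cited in the references) give more careful estimates.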
55. 2- Internal Consistency
Internal consistency is a measure of the extent to which items in a questionnaire (sub)scale are correlated (homogeneous), thus measuring the same concept.
Internal consistency is an important measurement property for questionnaires that intend to measure a single underlying concept (construct) by using multiple items.
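For a multi-item scale, internal consistency is commonly summarized with Cronbach's alpha; a minimal sketch with illustrative Likert scores:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
import statistics

def cronbach_alpha(items):
    """items: one list of respondent scores per item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]          # per-respondent totals
    item_var = sum(statistics.pvariance(s) for s in items)    # sum of item variances
    total_var = statistics.pvariance(totals)                  # variance of total score
    return (k / (k - 1)) * (1 - item_var / total_var)

# three items answered by five respondents (1-5 Likert scores, illustrative)
items = [
    [4, 3, 5, 2, 4],
    [4, 2, 5, 3, 4],
    [5, 3, 4, 2, 5],
]
alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.2f}")
```

Items that correlate strongly (here the three items rise and fall together across respondents) yield a high alpha, indicating they measure the same underlying construct.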
56. 3- Construct Validity
Construct Validity is used to ensure that the measure actually measures what it is intended to measure (i.e., the construct) and not other variables. Using a panel of "experts" familiar with the construct is one way in which this type of validity can be assessed.
The experts can examine the items and decide what that specific item is intended to measure.
57. 4- Reproducibility
Reproducibility concerns the degree to which repeated measurements in stable persons (test-retest) provide similar answers.
58. 5- Criterion Validity
Criterion validity refers to the extent to which scores on a particular instrument relate to a gold standard
59. 6- Responsiveness
The ability of a questionnaire to detect important changes over time, even if these changes are small.
Responsiveness is a measure of longitudinal validity.
60. 6- Responsiveness (cont.)
A longitudinal study with pre- and post- testing is required for determining responsiveness.
It is important in this process that the design and potential intervention being used represent the situation in which the questionnaire will be used in the future.
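One common way to quantify responsiveness from such pre/post data is the standardized response mean (mean change divided by the standard deviation of the change scores); a sketch with illustrative scores:

```python
# Standardized response mean (SRM) from paired pre/post questionnaire scores.
import statistics

def srm(pre, post):
    changes = [b - a for a, b in zip(pre, post)]
    return statistics.mean(changes) / statistics.stdev(changes)

pre  = [10, 12, 9, 14, 11, 13]   # baseline scores (illustrative)
post = [11, 16, 9, 19, 13, 16]   # scores after the intervention

print(f"SRM = {srm(pre, post):.2f}")
```

By convention SRM values around 0.2, 0.5, and 0.8 are commonly read as small, moderate, and large responsiveness, respectively.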
61. 7- Floor or Ceiling Effects
The number of respondents who achieved the lowest (floor) or highest (ceiling) possible score.
Calculation: the percentage of subjects who achieved the maximum score (ceiling) or the minimum score (floor).
These effects are considered present when more than 15% of respondents reach the ceiling or floor score.
Floor and ceiling effects have implications for the questionnaire's reproducibility and responsiveness.
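The calculation can be sketched as follows, with illustrative totals on an assumed 0-20 scale:

```python
# Floor/ceiling check: percentage of respondents at the minimum or
# maximum possible score of the scale.
def floor_ceiling(scores, minimum, maximum):
    n = len(scores)
    floor_pct = 100 * sum(s == minimum for s in scores) / n
    ceiling_pct = 100 * sum(s == maximum for s in scores) / n
    return floor_pct, ceiling_pct

scores = [20, 18, 20, 20, 15, 20, 19, 20, 20, 17]   # illustrative totals
floor_pct, ceiling_pct = floor_ceiling(scores, minimum=0, maximum=20)

print(f"floor: {floor_pct:.0f}%  ceiling: {ceiling_pct:.0f}%")
if ceiling_pct > 15 or floor_pct > 15:
    print("possible floor/ceiling effect (>15% threshold)")
```

Here 60% of respondents hit the maximum score, far above the 15% threshold, so a ceiling effect would be flagged.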
62. 7- Floor or Ceiling Effects (cont.)
If floor or ceiling effects are present, it is likely that extreme items are missing in the lower or upper end of the scale, indicating limited content validity. As a consequence, subjects with the lowest or highest possible score cannot be distinguished from each other, thus reliability is reduced.
Furthermore, the responsiveness is limited because changes cannot be measured in these patients.
63. A floor effect
A Floor Effect is when most of your subjects score near the bottom. There is very little variance because the floor of your test is too high.
The question is too hard for the group you are testing.
64. A ceiling effect
A Ceiling Effect is when most of your subjects score near the top. There is very little variance because the ceiling of your test is too low.
The question is too easy for the group you are testing.
65. 8- Interpretability
Interpretability is defined as
the degree to which one can assign qualitative meaning to quantitative scores.
66. Systematic Evaluation of Questionnaires
How do we find questionnaire problems?
By conducting a structured expert review
(Technical Review)
Example: QAS-99
67. Questionnaire Appraisal System
Questionnaire Appraisal System
(QAS-99)
QAS-99 is based on a system developed for the Behavioral Surveillance Branch of the Centers for Disease Control and Prevention for use in evaluating questions for the Behavioral Risk Factor Surveillance System (BRFSS).
http://www.cdc.gov/nccdphp/brfss
68. Questionnaire Appraisal System
For structured expert reviews, a set of criteria (a coding scheme) by which each question is to be examined is implemented.
The coding scheme is filled in for each question, so the entire draft questionnaire is worked through using the standardized coding categories.
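A coding sheet of this kind might be represented as below; the category labels follow the QAS-99 item list on the next slide, but the data structure and example question are hypothetical illustrations, not part of the official QAS-99 materials:

```python
# Hypothetical QAS-99-style coding sheet: each draft question is checked
# against every problem category and flagged Yes/No, with reviewer notes.
QAS_CATEGORIES = [
    "reading", "instructions", "clarity", "assumptions",
    "knowledge/memory", "sensitivity/bias", "response categories",
]

def new_coding_sheet(question_text):
    """Blank coding sheet for one draft question, all categories 'No'."""
    return {"question": question_text,
            "codes": {cat: "No" for cat in QAS_CATEGORIES},
            "notes": []}

sheet = new_coding_sheet("How often do you usually exercise a lot?")
sheet["codes"]["clarity"] = "Yes"
sheet["notes"].append("3c - 'a lot' is vague; define frequency and duration")

flagged = [cat for cat, v in sheet["codes"].items() if v == "Yes"]
print("flagged categories:", flagged)
```

Completing one sheet per question yields a systematic record of which questions need revision and why.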
69. QAS-99 (Items)
1. PROBLEMS WITH READING
2. PROBLEMS WITH INSTRUCTIONS
3. PROBLEMS WITH ITEM CLARITY
4. PROBLEMS WITH ASSUMPTIONS
5. PROBLEMS WITH KNOWLEDGE/MEMORY
6. PROBLEMS WITH SENSITIVITY/BIAS
7. PROBLEMS WITH RESPONSE CATEGORIES
70. 1. PROBLEMS WITH READING:
Determine if it is difficult for the interviewers to read the question uniformly to all respondents.
1a – What to read: Interviewers may have difficulty determining what parts of the question are to be read.
1b – Missing information: Information the interviewer needs to administer the question is not contained in the question.
1c – How to read: Question is not fully scripted and therefore difficult to read.
71. 2. PROBLEMS WITH INSTRUCTIONS:
Look for problems with any introductions, instructions, or explanations from the respondent’s point of view.
2a – Conflicting or inaccurate instructions, introductions, or explanations.
2b – Complicated instructions, introductions, or explanations
72. 3. PROBLEMS WITH ITEM CLARITY:
Identify problems related to communicating the intent or meaning of the question to the respondent
3a – Wording: The question is lengthy, awkward, ungrammatical, or contains complicated syntax.
3b – Technical terms are undefined, unclear or complex.
3c – Vague: The question is vague because there are multiple ways in which to interpret it or to determine what is to be included and excluded.
3d – Reference periods are missing, not well specified, or are in conflict.
73. 4. PROBLEMS WITH ASSUMPTIONS:
Determine if there are problems with the assumptions made or the underlying logic.
4a – Inappropriate assumptions are made about the respondent or his/her living situation.
4b – Assumes constant behaviour: The question inappropriately assumes a constant pattern of behaviour or experience for situations that in fact vary.
4c – Double-barrelled question that contains multiple implicit questions.
74. 5. PROBLEMS WITH KNOWLEDGE/MEMORY:
Check whether respondents are likely to not know or have trouble remembering information.
5a – Knowledge: The respondent is unlikely to know the answer.
5b – An attitude that is asked about may not exist.
5c – Recall failure.
5d – Computation or calculation problem
75. 6. PROBLEMS WITH SENSITIVITY/BIAS
Assess questions for sensitive nature or wording, and for bias.
6a – Sensitive content: The question is on a topic that people will generally be uncomfortable talking about.
6b – A socially acceptable response is implied.
76. 7. PROBLEMS WITH RESPONSE CATEGORIES:
Assess the adequacy of the range of responses to be recorded.
7a – Open-ended question that is inappropriate or difficult.
7b – Mismatch between question and answer categories.
7c – Technical terms are undefined, unclear, or complex.
7d – Vague response categories.
7e – Overlapping response categories.
7f – Missing response categories.
7g – Illogical order of response categories.
77. QAS-99 (Reference)
Question Appraisal System (QAS-99). By Gordon B. Willis and Judith T. Lessler. Research Triangle Institute, Suite 420, 6110 Executive Blvd., Rockville, MD 20852. August 1999.
http://www.cdc.gov/nccdphp/brfss
Biemer, P.P. and Lyberg, L.E. (2003). Introduction to Survey Quality. Hoboken, New Jersey: John Wiley & Sons.
78. Cited References
http://appliedresearch.cancer.gov/areas/cognitive/qas99.pdf
http://www.joe.org/joe/2007february/tt2.php
https://readability-score.com/
http://www.joe.org/joe/1990summer/tt2.php
http://www.emgo.nl/kc/preparation/research%20design/8%20Questionnaires%20selecting,%20translating%20and%20validating.htm
Scientific Advisory Committee of the Medical Outcomes Trust. Assessing health status and quality-of-life instruments: attributes and review criteria. Qual Life Res 2002;11:193–205.
http://ec.europa.eu/eurostat/ramon/statmanuals/files/Handbook_of_Practices_for_Quest.pdf
79. Thank You