I/O chapter 2 by Jason Manaois

Presentation Transcript

  • 1. Research in Psychology A Scientific Endeavor
  • 2. Objectives Describe the goals of psychological research Learn the process of doing research Understand the different research methods Explore the advantages and disadvantages of doing research and its different methodologies Apply ethical considerations in doing research
  • 3. Definition RESEARCH  A formal process by which knowledge is produced and understood GENERALIZABILITY  The extent to which conclusions drawn from one research study spread or apply to a larger population.
  • 4. Goals of Psychological Research Description of social behavior  Are people who grow up in warm climates different from those in cold climates? Establish a relationship between cause & effect  Does heat cause higher amounts of aggression? Develop theories about why people behave the way that they do  We dislike Duke students to feel better about ourselves Application  Creating effective therapeutic treatments, more successful negotiation tactics, and greater understanding amongst groups of people
  • 5. Empirical Research Cycle
  • 6. Empirical Research Empirical  Knowledge based on direct observation Theory  A set of ideas that tries to explain what we observe; a statement that proposes to explain relationships among phenomena of interest  Theoretical diversity
  • 7. Statement of a Problem Inductive Method  A research process in which conclusions are drawn about a general class of objects or people based on knowledge of a specific member of the class under investigation.  DATA → THEORY Deductive Method  A research process in which conclusions are drawn about a specific member of a class of objects or people based on knowledge of the general class under investigation.  THEORY → DATA
  • 8. Design of the Research Study  Research Design  A plan for conducting scientific research for the purpose of learning about the phenomenon of interest  Internal Validity  The degree to which the relationships evidenced among the variables in a particular research study are accurate or true  External Validity  The degree to which the relationships evidenced among the variables in a particular research study are generalizable or accurate in other contexts
  • 9. Concerns Naturalness of the Research Setting  Laboratory vs. Field Degree of Control  Are you able to control or manage the conduct of the research?
  • 10. Primary Research  Laboratory experiment  Quasi-experiment  Questionnaire  Observation  Secondary Research  Meta-analysis
  • 11. Qualitative Research Ethnography  A research method that utilizes field observation to study a society’s culture  EMIC – insider’s view  ETIC – external view
  • 12. Measurement of Variables Variables Quantitative Variables  E.g. Age, weight Categorical Variables  E.g. Gender, race
  • 13. Variables Independent Variables  A variable that can be manipulated to influence the values of the dependent variable. (the one that you manipulate) Dependent Variables  A variable whose values are influenced by the independent variables. (the one that you measure)
  • 14. Examples  Leadership style → Employee performance  Employee trainability → Employee performance
  • 15. Variable in I/O Predictor Variable  A variable used to predict or forecast a criterion variable Criterion Variable  A variable that is a primary object of a research study; it is forecasted by a predictor variable
  • 16. Examples  Personality → Leadership style  Employee performance → Employee morale
  • 17. Analysis of Data Descriptive Statistics  Mean  Median  Mode Variability  Range  SD  Correlation
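
To make slide 17 concrete, here is a minimal Python sketch of the listed descriptives using only the standard library; the data are hypothetical, and `statistics.correlation` requires Python 3.10+.

```python
import statistics

# Hypothetical data: satisfaction ratings and salaries for ten employees.
satisfaction = [3, 4, 2, 5, 4, 3, 4, 5, 2, 4]
salary = [42, 50, 38, 60, 52, 40, 55, 63, 35, 48]  # in $1000s

# Descriptive statistics: central tendency.
print("mean:  ", statistics.mean(satisfaction))
print("median:", statistics.median(satisfaction))
print("mode:  ", statistics.mode(satisfaction))

# Variability.
print("range: ", max(satisfaction) - min(satisfaction))
print("SD:    ", statistics.stdev(satisfaction))

# Correlation between the two variables (Python 3.10+).
print("r(satisfaction, salary):", statistics.correlation(satisfaction, salary))
```
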
  • 18. Conclusions from Research After collecting and analyzing data, the researcher draws conclusions. Answers the research hypothesis or research problem. Generalizability of research findings?
  • 19. Ethical Issues Right to informed consent Right to privacy Right to confidentiality Right to protection from deception Right to debriefing
  • 20. Another Perspective
  • 21. The Process of Doing Research First, select a topic  Good theory:  Has predictive power  Is simple & straightforward Then, search the literature  Find out what others have done that may be applicable to your area of interest
  • 22. The Process of Doing Research Next, formulate hypotheses  Hypothesis: specific statement of expectation derived from theory  State the relationship between two variables  Variable: can be any event, characteristic, condition, or behavior
  • 23. The Process of Doing Research Then pick your research method  Design: experimental vs. correlational  Setting: field vs. laboratory  Finally, collect & analyze your data
  • 24. Let’s take a closer look . . .at variables Dependent variable (outcome variable)  Dependent on the influence of other factor(s)  How do we operationalize? Independent variable (predictor variable)  Factor(s) that change the outcome variable  How do we operationalize & manipulate?  Control group
  • 25. Let’s take a closer look . . . at research methods  Experimental vs. correlational designs  Correlational: observe the relationship between two variables  Describe patterns of behavior  Types include  Naturalistic observation  Case studies  Surveys
  • 26. Correlational research Advantages  Sometimes manipulation of variables is impossible or unethical  Efficient – look at lots of data Disadvantages  CANNOT DETERMINE CAUSATION  Could be a lurking variable
  • 27. Experimental Research Researcher manipulates one variable (IV) to see effect on other variable (DV)  Try to hold everything else constant True experiments have  Random sampling: selecting Ps randomly from population  Random assignment: chance assignment to condition
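
Slide 27 distinguishes random sampling (who gets into the study) from random assignment (who gets which condition). A minimal sketch of both, with a hypothetical participant pool:

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

# Hypothetical population of potential participants.
population = [f"P{i:03d}" for i in range(500)]

# Random sampling: every population member has an equal chance of selection.
sample = random.sample(population, 40)

# Random assignment: chance alone decides each participant's condition.
random.shuffle(sample)
treatment, control = sample[:20], sample[20:]
print(len(treatment), "in treatment,", len(control), "in control")
```
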
  • 28. Ethics in Research Should the study be done?  Value vs. potential cost  APA guidelines, colleagues How do we protect Ps?  Informed consent  Confidentiality & anonymity  Debriefing
  • 29. Organizational Research Methods: CAUSALITY
  • 30. What Do We Mean By Causality?  Relationship between two events where one is a consequence of the other  Determinism: A (cause) leads to B (effect)  “In the strict formulation of the law of causality—if we know the present, we can calculate the future—it is not the conclusion that is wrong but the premise.”  On an implication of the uncertainty principle. Werner Heisenberg
  • 31. Heisenberg & Uncertainty Principle  Certain properties of subatomic particles are linked so that the more accurately you know one, the less accurately you know the other  We can compute probabilities, not certainties  Argues against determinism  “Physics should only describe the correlation of observations; there is no real world with causality” Heisenberg, 1927, Zeitschrift für Physik  Psychology, like quantum physics, is probabilistic
  • 32. Cause Versus Effect  Effect of a Cause (Description)  What follows a cause?  Cause of an Effect (Explanation)  Why did the effect happen?  Holland, P. W. (1988). Causal inference, path analysis, and recursive structural equations models. Sociological Methodology, 18, 449-484.
  • 33. Three Elements of a Causal Case  Cause and effect are related  Cause preceded effect  No plausible alternative explanations  John Stuart Mill
  • 34. Experiment Vary something to discover effects  Shows association  Shows time sequence  Can rule out only some alternatives  Confounds  Boundary conditions (generalizability) Good for causal description not explanation Natural science control through precise measurement  Sterile test tubes, electronic instruments
  • 35. Studying and Performance  Students randomly assigned to study amount  Test scores as DV  Did studying lead to test results?  Encouragement led to test results  Impact on studying unclear  Effect of studying unclear  What was the cause of test results?  Holland, P. W. (1988). Causal inference, path analysis, and recursive structural equations models. Sociological Methodology, 18, 449-484.
  • 36. Nonexperimental Research Strategy  1. Determine covariation  2. Test for time sequence  • Longitudinal design  • Quasi-experiment  3. Rule out plausible alternatives  • Based on data/theory  • Logical
  • 37. Can Job Satisfaction Cause Gender?  Correlation of gender and satisfaction = group mean differences  Satisfaction can’t cause someone’s gender  Satisfaction can be the cause of the gender distribution of a sample  Suppose females have higher satisfaction than males  Multiple reasons
  • 38. Alternative Gender-Job Satisfaction Model  Females more likely to quit dissatisfying jobs  Dissatisfaction causes gender distribution  Gender moderates relation of satisfaction with quitting
  • 39. [Figure: distribution of satisfaction scores plotted for females vs. males]
  • 40. More Alternatives  1. Women less likely to take a dissatisfying job (better job decisions)  2. Women less likely to be hired into dissatisfying jobs (protected)  3. Women less likely to be bullied/mistreated  4. Women given more realistic previews (lower expectations)  5. Women more socially skilled at getting what they want at work
  • 41. How To Use Controls  Controls are great devices to test hypotheses/theory  Rule in/out plausible alternatives  Best based on theory  Sequence of tests
  • 42. Control Strategy  1. Test that A and B are related  • Salary relates to job satisfaction  2. Confirm/disconfirm control variable  • Gender relates to both  3. Generate/test alternative explanations for control variable  • Differential expectations  • Differential hiring rate  • Differential job experience  • Differential turnover rate  (A partial-correlation sketch follows below.)
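
One way to implement steps 1 and 2 of this control strategy is a partial correlation: correlate A and B after removing what the control variable explains in each. The sketch below uses NumPy and simulated salary/satisfaction/gender data; all variable names and effect sizes are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data echoing the slide: salary (A), job satisfaction (B),
# and gender coded 0/1 as the control variable (C).
n = 200
gender = rng.integers(0, 2, n).astype(float)
salary = 40 + 10 * gender + rng.normal(0, 5, n)
satisfaction = 3 + 0.02 * salary + 0.5 * gender + rng.normal(0, 1, n)

def partial_corr(a, b, c):
    """Correlate the parts of a and b that the control c cannot explain."""
    resid_a = a - np.poly1d(np.polyfit(c, a, 1))(c)
    resid_b = b - np.poly1d(np.polyfit(c, b, 1))(c)
    return np.corrcoef(resid_a, resid_b)[0, 1]

# Step 1: A and B are related (zero-order correlation).
print("r(salary, satisfaction):", np.corrcoef(salary, satisfaction)[0, 1])
# Step 2: does the relation survive after controlling for gender?
print("partial r:", partial_corr(salary, satisfaction, gender))
```

If the partial correlation stays near the zero-order value, gender is probably not the explanation; if it shrinks toward zero, the control variable deserves the follow-up tests in step 3.
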
  • 43. Validity and Threats To Validity Validity  Interpretation of constructs/results  Inference based on purpose  Hypothesized causal connections among constructs  Nature of constructs  Population of interest  People  Settings
  • 44. Four Types of Design Validity Statistical conclusion  Appropriate statistical method to make desired inference Internal validity  Causal conclusions reasonable based on design Construct validity  Interpretation of measures External validity  Generalizability to population of interest
  • 45. Threats to Validity Statistical Conclusion  Statistics used incorrectly  Low power  Poor measurement Internal Validity  Confounds of IV with other events, variables  Group differences (pre-existing or attrition)  Lack of temporal order  Instrument changes
  • 46. Threats To Validity 2  Construct Validity  Inadequate specification of theoretical construct  Unreliable measurement  Biases  Poor content validity  External Validity  Inadequate specification of population  Poor sampling of population  Subjects  Settings
  • 47. Qualitative Methods What are qualitative methods?  Collection/analysis of written/spoken text  Direct observation of behavior Participant observation Case study Interview Written materials  Existing documents  Open-ended questions
  • 48. Qualitative Research 2 Accept subjectivity of science  Is this an excuse? Less driven by hypothesis Assumption that reality is a social construction  If no one knows I’ve been shot, am I really dead? Interested in subject’s viewpoint More open-ended More interested in context Less interested in general principles Focus more on interpretation than quantification
  • 49. Analysis Content Analysis  Interviews  Written materials  Open-ended questions  Audio or video recordings  Quantifying  Counts of behaviors/events  Categorization of incidents  Multiple raters with high agreement Nonquantitative  Analysis of case  Narrative description
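
Slide 49 calls for multiple raters with high agreement when quantifying qualitative codes. Cohen's kappa is one standard chance-corrected agreement index for two raters; here is a self-contained sketch in which the incident codes are invented for illustration.

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Agreement between two raters' category codes, corrected for chance."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: probability both raters pick each category independently.
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2.get(k, 0) for k in c1) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical codes assigned to ten stress incidents by two raters.
r1 = ["conflict", "overload", "conflict", "control", "overload",
      "conflict", "conflict", "overload", "control", "conflict"]
r2 = ["conflict", "overload", "conflict", "overload", "overload",
      "conflict", "control", "overload", "control", "conflict"]
print(f"kappa = {cohens_kappa(r1, r2):.2f}")  # ~0.69 for this toy data
```
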
  • 50. The Value of the Qualitative Approach  What is the value/use of this approach?  Is this science?  Must everything be quantified?
  • 51. Qualitative Organizational Research: Job Stress  Quantitative survey dominates  Role ambiguity and conflict dominated in 1980s & 1990s (Katz & Kahn)  Dominated by Rizzo et al.’s weak scales
  • 52. Keenan & Newton’s SIR Stress Incident Record  Describe event in prior 2 weeks  Aroused negative emotion Top stressful events for engineers  Time/effort wasted  Interpersonal conflict  Work underload  Work overload  Conditions of employment
  • 53. Subsequent SIR Research  Comparison of occupations  Clerical: Work overload, lack of control  Faculty: Interpersonal conflict, time wasters  Sales clerks: Interpersonal conflict, time wasters  Informed subsequent quantitative studies  Focus on more common stressors  Interpersonal conflict  Organizational constraints
  • 54. Cross-Cultural SIR Research  Comparison of university support staff, India vs. U.S.

        Stressor                  India    U.S.
        Overload                  0%       25.6%
        Lack of control           0%       22.6%
        Lack of structure         26.5%    0%
        Constraints (equipment)   15.4%    0%
        Conflict                  16.5%    12.3%
  • 55. Research As Craft  Scholarly research as expertise, not a bag of tricks  Logical case  Go beyond sheer technique  Research not just formulaic/trends  Not just using the right design, measures, stats
  • 56. Developing the Craft Experience Trying different things  Constructs  Designs/methods  Problems  Statistics Reading Reviewing Teaching Thinking/discussing Courses necessary but not sufficient Lifelong learning—you are never done
  • 57. Developing the Craft  Field values novelty and rigor  Don’t be afraid of exploratory research  Not much contribution if answer known in advance  Look for surprises  Don’t be afraid to follow intuition  Ask an interesting question without a clear answer  Focus on interesting variables  Good papers tell stories  Variables are characters  Relationships among variables
  • 58. Construct & External Validity and Method Variance
  • 59. Constructs Theoretical level  Conceptual definitions of variables  Basic building blocks of theories Measurement level  Operationalizations  Based on theory of construct
  • 60. What We Do With Constructs Define Operationalize/Measure Establish relations with other constructs  Covariation  Causation
  • 61. External Validity: Population Link between sample and theoretical population Define theoretical population Identify critical characteristics Compare sample to population  Employed individuals  Do students qualify?
  • 62. External Validity: Setting Link between current setting and other settings  Organization  Occupation Identify critical characteristics of settings Compare setting to others  Lab to field
  • 63. External Validity: Treatment/IV Link between current treatment/IV and others Compare treatment/IV  Distance learning vs. traditional
  • 64. External Validity: Outcome/DV  Link between current outcome/DV and others  Will results in the study work similarly in nonresearch conditions?  Will different operationalizations of the outcome have the same result?  Supervisor rating of performance vs. objective measures  Safety behavior vs. accidents/injuries
  • 65. When Politics Attack Science Evolution IQ and performance Differential validity of IQ tests Others?
  • 66. Quasi-Experimental Design What is an experiment?  Random assignment  Creation of Conditions?  Naturally occurring experiment
  • 67. Quasi-experiment Design without random assignment Comparison of conditions Researcher created or existing Can characteristics of people be an IV?  Gender  Personality Is a survey a quasi-experiment?
  • 68. Settings  Laboratory vs. field  Laboratory  Setting in which phenomenon doesn’t naturally occur  Field  Setting in which phenomenon naturally occurs  The classroom is the field for an educational psychologist  The classroom is the lab for us
  • 69. Lab vs. Field Strengths/Weaknesses  Lab  High level of control  Easy to do experiments  Limits to what can be studied  Limited external validity of population/setting  Field  Limited control  Difficult to do experiments  Wide range of what can be studied  High reliance on self-report  High external validity
  • 70. Lab in I/O Research  What’s the role of the lab in I/O research?  Stone suggests the lab is as generalizable as the field. Do you agree?  Stone says the I/O field is biased against the lab. Is it?  When should we do lab vs. field studies?
  • 71. Challenges To Field Research Access to organizations/subjects Lack of control  Distal contact with subjects (surveys)  Who participates  Contaminating conditions  Participants discussing study Lack of full cooperation Organizational resistance to change
  • 72. Survey Methods & Constructs Survey methods Sampling Cross-cultural challenges  Measurement equivalence/invariance
  • 73. Survey Settings Within employer organization Within other organization  University  Professional association  Community group  Club General population  Phone book  Door-to-door
  • 74. Methods Questionnaire  Paper-and-pencil  E-mail  Web Interview  Face-to-face  Phone  Video-phone  E-mail  Instant Message
  • 75. Population  Single organization  Multiple organizations  Within industry/sector  Single occupation  Multiple occupations  General population  Employed students
  • 76. Sample Versus Population  Survey everyone in population vs. sample  Single organization or unit of organization  Often survey goes to everyone  Multiple organizations  Kessler: All psychology faculty  Other organization  Professional association  Often survey everyone  General population
  • 77. Sampling Definitions Population – Aggregate of cases meeting specification  All humans  All working people  All accountants  Not always directly measurable Sampling frame – List of all members of a population to be sampled  List of all USF support personnel
  • 78. Sampling Definitions cont.  Stratum – Segment of a population, divided by a characteristic  Demographics  Male vs. female  Job level  Manager vs. nonmanager  Job title  Occupation  Department/division of organization  (A stratified-sampling sketch follows below.)
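
To illustrate drawing from strata, here is a small Python sketch of stratified sampling from a sampling frame. The frame, the stratum variable (job level), and the per-stratum quota are all hypothetical.

```python
import random

def stratified_sample(frame, stratum_of, per_stratum, seed=42):
    """Draw up to per_stratum cases from each stratum of a sampling frame."""
    random.seed(seed)
    strata = {}
    for case in frame:
        strata.setdefault(stratum_of(case), []).append(case)
    return {s: random.sample(members, min(per_stratum, len(members)))
            for s, members in strata.items()}

# Hypothetical frame: (name, job level) for all support personnel.
frame = [("A", "manager"), ("B", "nonmanager"), ("C", "nonmanager"),
         ("D", "manager"), ("E", "nonmanager"), ("F", "nonmanager")]
print(stratified_sample(frame, stratum_of=lambda c: c[1], per_stratum=2))
```
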
  • 79. Instrument Issues Linguistic meaning  Translation – Back-translation Calibration  Numerical equivalence  Cultural response tendencies  Asian modesty  Latin expansiveness Measurement equivalence  Construct validity  Factor Structure
  • 80. What Is A Theory?  Bernstein  Set of propositions that account for, predict, and control phenomena  Muchinsky  Statement that explains relationships among phenomena  Webster  General or abstract principles of science  Explanation of phenomena
  • 81. Types of Theories Inductive  Starts with data  Theory explains observations Deductive  Starts with theory  Data used to support/refute theory
  • 82. Common Usage of Theory Conjecture, opinion, speculation or hypothesis  Wikipedia
  • 83. Advantages Integrates and summarizes large amounts of data Can help predict Guides research Helps frame good research questions
  • 84. Disadvantages  Biases researchers  “Theory, like mist on eyeglasses, obscures facts” (Charlie Chan in Muchinsky)  “Facts are the enemy of truth” (Levine’s boss)  A distraction, as research does not require theory (Skinner)
  • 85. Hypothesis Statement of expected relationships among variables Tentative More limited than a theory Doesn’t deal with process or explanation
  • 86. Model Representation of a phenomenon Description of a complex entity or process  Webster Boxes and arrows showing causal flow
  • 87. Theoretical Construct Abstract representation of a characteristic of people, situation, or thing Building blocks of theories
  • 88. Paradigm Accepted scientific practice Rules and standards for scientific practice Law, theory, application and instrumentation that provide models for research.  Thomas Kuhn
  • 89. What Are Our Paradigms? Behaviorism? Environment-perception-outcome approach Surveys
  • 90. Ethics In Research
  • 91. Ethical Practices Conducting Research  Treatment of human subjects  Treatment of organizational subjects Data Analysis/Interpretation Disseminating Results  Publication Peer reviewing
  • 92. Ethical Codes Appropriate moral behavior/practice Accepted practices Basic Principle: Do no harm Protect dignity, health, rights, well-being Codes  APA??
  • 93. American Psychological Association Code  Largely practice oriented  Five principles  Beneficence and Nonmaleficence [Do no harm]  Fidelity and Responsibility  Integrity  Justice  Respect for People’s Rights and Dignity  Standards and practices  Applies to APA members  http://www.apa.org/ethics/
  • 94. Preamble  Psychologists are committed to increasing scientific and professional knowledge of behavior and people’s understanding of themselves and others, and to the use of such knowledge to improve the condition of individuals, organizations, and society. Psychologists respect and protect civil and human rights and the central importance of freedom of inquiry and expression in research, teaching, and publication. They strive to help the public in developing informed judgments and choices concerning human behavior. In doing so, they perform many roles, such as researcher, educator, diagnostician, therapist, supervisor, consultant, administrator, social interventionist, and expert witness.
  • 95. APA Conflict Between Profession and Ethical Principles  Restriction of advertising  Violation of the law  Maximization of income for members  Tolerance of torture  Convoluted statements  Other associations manage to avoid such conflicts
  • 96. Academy of Management Code Largely academically oriented Three Principles Responsibility Integrity Respect for people’s rights and dignity Responsibility to Students Advancement of managerial knowledge AOM and larger profession Managers and practice of management All people in the world http://www.aomonline.org/aom.asp?ID=&page_ID=239
  • 97. Professional Principles  Our professional goals are to enhance the learning of students and colleagues and the effectiveness of organizations through our teaching, research, and practice of management.
  • 98. Principles Vs. Practice Principles clear in theory Ethical line not always clear Ethical dilemmas  Harm can be done no matter what is done  Conflicting interests between parties  Employee versus organization  Whose rights take priority?
  • 99. Example: Exploitive Relationships  Principle  Psychologists do not exploit persons over whom they have supervisory, evaluative, or other authority  What does it mean to exploit?
  • 100. Conducting Research  Privacy  Informed consent  Safety  Debriefing  Inducements
  • 101. Privacy Anonymity: Best protection  Procedures to match data without identities Confidentiality  Security of identified data  Locked computer/cabinet/lab  Encoding data  Code numbers cross-referenced to names  Removing names and identifying information
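
One common way to implement the "code numbers cross-referenced to names" idea from slide 101 is to store the identity key separately from the de-identified data. A minimal sketch; the file names and responses are hypothetical, and in practice the key file would live in a locked, access-controlled location.

```python
import csv
import secrets

# Hypothetical identified responses: (name, satisfaction rating).
responses = [("Alice", 4), ("Bob", 2), ("Carol", 5)]

# Assign each person a random code number.
key = {name: secrets.token_hex(4) for name, _ in responses}

# The key file cross-references codes to names; store it under lock.
with open("key.csv", "w", newline="") as f:
    csv.writer(f).writerows(key.items())

# The data file carries only code numbers, no identifying information.
with open("data.csv", "w", newline="") as f:
    csv.writer(f).writerows((key[name], rating) for name, rating in responses)
```
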
  • 102. Informed Consent  Subject must know what is involved  Purpose  Disclosure of risk  Benefits of research  Researcher/society  Subject  Privacy/confidentiality  Who has access to data  Who has access to identity  Right to withdraw  Consequences of withdrawal
  • 103. Safety Minimize exposure to risk  Workplace safety study: Control group Physical and psychological risk
  • 104. Debriefing Subject right to know Educational experience for students Written document Presentation Surveys: Provide contact for follow-up Provide results in future upon request
  • 105. Inducements Pure Volunteer – no inducement Course requirement  Is this coercion? Extra credit Financial payment  Is payment coercion?
  • 106. Successful Research Career  Conducting good research  Lead, don’t follow  Visibility  Good journals  Conferences  Other outlets  Quantity  First-authored publications  More important early in career  Impact  Grants
  • 107. Programmatic Program of research  More conclusive  Multiple tests  Boundary conditions  More impact through visibility  Helps getting jobs  Helps with tenure/promotion  Can have more than one focus
  • 108. Conducting Successful Research  Develop an interesting question  Based on theory  Based on literature  Based on observation  Based on organization need  Link question to literature  Theoretical perspective  Place in context of what’s been done  Multiple types of evidence  Consider other disciplines
  • 109. Conducting Successful Research 2  Design one or more research strategies  Lab vs. field  Data collection technique  Survey, interview, observation, etc.  Design  Experimental, quasi-experimental, or observational  Cross-sectional or longitudinal  Single-source or multisource  Instrumentation  Existing or ad hoc
  • 110. Conducting Successful Research 3  Analysis  Hierarchy of methods from simple to complex  Descriptives  Bivariate relationships  Test for controls  Complex relationships  Multiple regression  Factor analysis  HLM  SEM  (A regression sketch follows below.)
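
To illustrate the lower rungs of that hierarchy, here is a short NumPy sketch that moves from descriptives to a bivariate correlation to a multiple regression on simulated data. The variable names and effect sizes are invented; a real analysis would add significance tests and diagnostics.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: predict job performance from two hypothetical predictors.
n = 100
consc = rng.normal(0, 1, n)   # conscientiousness
gma = rng.normal(0, 1, n)     # general mental ability
perf = 0.4 * consc + 0.3 * gma + rng.normal(0, 1, n)

# Rung 1: descriptives.
print("means:", consc.mean(), gma.mean(), perf.mean())
# Rung 2: bivariate relationships.
print("r(consc, perf):", np.corrcoef(consc, perf)[0, 1])
# Rung 3+: multiple regression via ordinary least squares.
X = np.column_stack([np.ones(n), consc, gma])  # intercept + predictors
b, *_ = np.linalg.lstsq(X, perf, rcond=None)
print("weights (intercept, consc, gma):", b)
```
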
  • 111. Conducting Successful Research 4  Conclusions  What’s reasonable based on data  Alternative explanations  Speculation  Theoretical development  Suggestions for future
  • 112. Impact  Effect of work on field/world  Citations  Sources  ISI Thomson  Harzing’s Publish or Perish  Others  Self-citation  Citation studies  Individuals (e.g., Podsakoff et al., Journal of Management, 2008)  Programs (e.g., Oliver et al., TIP, 2005)  Being attacked