Psyc327.Show2
Presentation Transcript

    • JANET’S NOTES FROM: I/O PSYCHOLOGY: UNDERSTANDING THE WORKPLACE BY PAUL E. LEVY, Ch 2: Research Methods
    • How to Open This Show…
      • To look at this slide show later:
      • Log-in to www.slideshare.net
      • ( Slide Share . NET )
      • Username : TUPSYC327
      • ( TU PSYC 327 )
      • Password : PASSWORD
    • The 4 Goals of Science
      • To generate a body of knowledge using the scientific process or method, follow these 4 steps:
      • Description
        • The accurate portrayal or depiction of the phenomenon of interest
      • Explanation
        • Gathering knowledge about why the phenomenon exists or what causes it
      • Prediction
        • The ability to anticipate an event prior to its actual occurrence.
      • Control
        • The manipulation of antecedent conditions to affect behavior. (Antecedent = prior to)
    • The 3 Assumptions of Science
      • Empiricism
        • The notion that the best way to understand behavior is to generate predictions based on theory, to gather data, and to use the data to test these predictions.
      • Determinism
        • Suggests that behavior is orderly and systematic and doesn’t just happen by chance
      • Discoverability
        • Suggests not only that behavior is orderly but also that this orderliness can be discovered
    • Other Definitions of Science Found on Urban Dictionary.com
      • What she blinded me with. Whoever "she" could be. She blinded me...with science!
      • An attempt to understand the world. Often contains big words that are unnecessary.
      • ----"Well, It appears science has failed again, in front of overwhelming religious evidence." --Reverend Lovejoy (The Simpsons)
    • The 5 Parts of A Theory
      • Parsimony
        • The theory should be able to explain a lot in as simple a way as possible
      • Precision
        • The theory should be specific and accurate in its wording and conceptual statements so that everyone knows what its propositions and predictions are
      • Testability
        • The propositions presented in the theory must be verifiable by some sort of experimentation
      • Usefulness
        • A theory should be practical and help in describing, explaining, and predicting an important phenomenon
      • Generativity
        • The theory should stimulate research that attempts to support or refute its propositions
    • Which Occurs First: Data or Theory?
      • Sir Francis Bacon, in the 1600s, believed in empirical observation – data was key
      • Induction
        • An approach to science that consists of working from data to theory
      • Deduction
        • An approach to science in which we start with theory and propositions and then collect data to test those propositions – working from theory to data.
      • Most scientists combine both approaches, but initially most research is driven by inductive processes
      • Refer to Fig. 2.1 on p. 29
    • Research Terminology and Basic Concepts: Independent and Dependent Variables
      • Causal Inference
        • A conclusion, drawn from research data, about the likelihood of a causal relationship between two variables.
      • Independent Variable
        • A variable that is systematically manipulated by the experimenter or, at the least, measured by the experimenter as a precursor to other variables.
      • Dependent Variable
        • The variable of interest, or what we design our experiments to assess.
      • Extraneous Variable
        • Anything other than the independent variable that can contaminate our results or be thought of as an alternative to our causal explanation.
    • Research Terminology and Basic Concepts: Internal and External Validity
      • Internal Validity
        • The extent to which we can draw causal inferences about our variables.
      • External Validity
        • The extent to which the results obtained in an experiment generalize to other people, settings, and times.
    • Research Terminology and Basic Concepts: Control
      • Why do we need control in research?
        • We need to be sure that other potential explanations aren’t affecting our results.
      • Ways to control for extraneous variables:
      • 1. The potentially extraneous variable can be held constant in our experiment
      • 2. We can systematically manipulate different levels of the variable
      • 3. We can use statistical control, such as the analysis of covariance, to remove or control the variability in our dependent variable that is due to the extraneous variable (sketched below).
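To make the third option more concrete, here is a minimal sketch (not from the textbook) of statistical control: the dependent variable is regressed on the extraneous variable, and the leftover (residual) scores are compared across groups. The variable names (job_experience, training_group, performance) are invented for illustration; a full analysis of covariance would model the group factor and the covariate together.

```python
# A minimal sketch of statistical control: remove the variability in the
# dependent variable that is due to an extraneous variable, then compare
# groups on what is left over. All data below are simulated.
import numpy as np

rng = np.random.default_rng(0)

job_experience = rng.normal(5, 2, size=100)        # extraneous variable
training_group = rng.integers(0, 2, size=100)      # 0 = control, 1 = trained
performance = 2 * job_experience + 3 * training_group + rng.normal(0, 1, 100)

# Regress performance on the extraneous variable (simple least squares).
slope, intercept = np.polyfit(job_experience, performance, deg=1)
residuals = performance - (slope * job_experience + intercept)

# Compare groups on the residuals: the experience-related variance is gone.
print("Mean residual, control group:", residuals[training_group == 0].mean())
print("Mean residual, trained group:", residuals[training_group == 1].mean())
```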
    • The Research Process
      • Hypothesis
        • A tentative statement about the relationship between two or more variables
      • Stages of the Research Process:
      • Formulate the Hypothesis
      • Design the Study
      • Collect Data
      • Analyze the Data
      • Report the Findings
    • Types of Research Designs: Experimental Methods
      • Experimental Methods
        • Research procedures that are distinguished by random assignment of participants to conditions and the manipulation of independent variables
      • Random Assignment
        • The procedure by which research participants, once selected, are assigned to conditions such that each one has an equally likely chance of being assigned to each condition
      • Manipulation
        • The systematic control, variation, or application of independent variables to different groups of participants
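A minimal sketch of random assignment as defined above: once participants are selected, each one has an equally likely chance of ending up in each condition. The participant IDs and condition labels are made up for illustration.

```python
# Randomly assign participants to conditions so each person has an equally
# likely chance of landing in each condition.
import random

participants = ["P01", "P02", "P03", "P04", "P05", "P06", "P07", "P08"]
conditions = ["treatment", "control"]

random.shuffle(participants)                 # randomize the order
assignment = {
    person: conditions[i % len(conditions)]  # alternate through conditions
    for i, person in enumerate(participants)
}
print(assignment)
```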
    • Types of Research Designs: Field Experiments and Quasi-Experiments
      • Quasi-Experiment
        • A research design that resembles an experimental design but does not include random assignment.
        • Often use intact groups
        • More feasible to conduct inside companies but still allow for reasonable levels of internal validity
    • Types of Research Designs: Observational Methods
      • Observational Methods are sometimes called “Correlational Designs”
        • Results usually analyzed by correlational approaches
        • (refer to statistics part of Chapter 2)
        • Descriptive research is important because although we cannot infer causality from such research, we can gather data that can be used to generate more causal hypotheses, which can be examined with experimental designs (p.39).
    • Data Collection Techniques: Naturalistic Observation
      • Unobtrusive Naturalistic Observation
        • An observational technique whereby the researcher unobtrusively and objectively observes individuals but does not try to blend in with them
      • I/O psychologists use this type of research more often than participant observation (blending in).
      • The researcher must be aware of the possibility that she is affecting behaviors and interactions through her observation.
    • Data Collection Techniques: Case Studies
      • Case Studies
        • Examinations of a single individual, group, company, or society
      • Similar to Naturalistic Observation
      • Might involve interviews, historical analysis, or research into the writings or policies of an individual or organization.
      • The main purpose of case studies is description (as with other observational methods), although explanation is a reasonable goal of case studies, too.
    • Data Collection Techniques: Archival Research
      • Archival Research
        • Research relying on secondary data sets that were collected either for general or specific purposes identified by an individual or organization.
      • The quality of research using an archival data set is strongly affected by the quality of that original study.
      • Steps 2 and 3 of the Research Process are already completed for the researcher, shortening the time needed for the study.
    • Data Collection Techniques: Surveys
      • Surveys
        • A data collection technique that involves the selection of a sample of respondents and the administration of some type of questionnaire
      • Most frequently used method of data collection in I/O Psychology
      • Self-administered questionnaires – easy to administer and can be given to large groups. They also provide anonymity to respondents.
      • Interviews – more time-consuming, but the response rate is higher than for self-administered questionnaires.
    • Measurement: Reliability
      • Measurement
        • The assignment of numbers to objects or events using rules in such a way as to represent specified attributes of the objects
      • Attribute
        • A dimension along which individuals can be measured and along which individuals vary
      • Reliability
        • The consistency or stability of a measure
      • Test-Retest Reliability
        • The stability of a test over time; often referred to as a coefficient of stability
    • Measurement: Reliability
      • Parallel Forms Reliability
        • The extent to which two independent forms of a test are equivalent measures of the same construct
      • Interrater Reliability
        • The extent to which multiple raters or judges agree on ratings made about a particular person, thing, or behavior
      • Internal Consistency
        • An indication of the extent to which individual test items seem to be measuring the same thing
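As an illustration of the reliability ideas above, here is a minimal sketch of test-retest reliability: the same test is given twice and the two sets of scores are correlated, giving the coefficient of stability. The scores are invented for illustration.

```python
# Test-retest reliability as the Pearson correlation between two
# administrations of the same test (invented scores).
import numpy as np

time1 = np.array([12, 15, 9, 20, 17, 14, 11, 18])   # first administration
time2 = np.array([13, 14, 10, 19, 18, 13, 12, 17])  # same people, weeks later

test_retest_r = np.corrcoef(time1, time2)[0, 1]
print(f"Test-retest reliability: {test_retest_r:.2f}")
```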
    • Measurement: Validity of Tests, Measures, and Scales
      • Construct Validity
        • The extent to which a test measures the underlying construct that it was intended to measure
      • Construct
        • An abstract quality, such as intelligence or motivation, that is not observable and is difficult to measure
      • Content validity
        • The degree to which a test or predictor covers a representative sample of the quality being assessed
      • Predictive Validity
        • The extent to which test scores obtained at one point in time predict criteria obtained in the future.
    • Measurement: Validity of Tests, Measures, and Scales
      • Concurrent Validity
        • The extent to which a test predicts a criterion that is measured at the same time that the test is conducted
      • Convergent Validity
        • The degree to which a measure of the construct in which we are interested is related to measures of other, similar constructs
      • Divergent Validity
        • The degree to which a measure of the construct in which we are interested is NOT related to measures of other, dissimilar constructs
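A minimal sketch of how a predictive validity coefficient is typically estimated: correlate test scores collected at one point in time with a criterion measured later (for example, performance ratings a year after hiring). All numbers below are invented for illustration.

```python
# Predictive validity as the correlation between test scores at hire and a
# criterion collected later (invented data).
import numpy as np

selection_test = np.array([55, 62, 48, 71, 66, 59, 80, 45])            # at hiring
performance_1yr = np.array([3.1, 3.4, 2.8, 4.0, 3.6, 3.2, 4.3, 2.6])   # one year later

predictive_validity = np.corrcoef(selection_test, performance_1yr)[0, 1]
print(f"Predictive validity coefficient: {predictive_validity:.2f}")
```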
    • Statistics: Measures of Central Tendency
      • Statistic
        • An efficient device for summarizing in a single number the value, characteristics, or scores describing a series of cases
      • Mode
        • The most frequent single score in a distribution
      • Median
        • The score in the middle of the distribution
      • Mean
        • The arithmetic average of a group of scores, typically the most useful measure of central tendency
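A minimal sketch of the three measures of central tendency defined above, using Python's standard statistics module on a made-up set of test scores.

```python
# Mode, median, and mean of an invented set of scores.
import statistics

scores = [70, 75, 75, 80, 85, 90, 95]

print("Mode:  ", statistics.mode(scores))    # most frequent score (75)
print("Median:", statistics.median(scores))  # middle score (80)
print("Mean:  ", statistics.mean(scores))    # arithmetic average
```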
    • Other Definitions of the Word “Statistics” Found on Urban Dictionary.com
      • The study of percentages, bars, graphs, and charts, all in an attempt to make some sort of logical conclusion out of a bunch of numbers so that even more percentages, bars, graphs, and charts can be made. Statistics may not be of practical use in everyday life, unless you own a casino.
      • The mathematical study of the distribution of data on any subject to prove any point you're trying to make, when whining about it doesn't work. 39% of all statistics are made up.
      • Also known as a useless subject one is forced to do to complete one's degree in Capitalism 101 so one can get a piece of paper at the end of four years to be part of a globalised world so one can feed one's children (if one chooses to have any) and live happily ever after. So if one does not do statistics one will lead a sad and lonely life. Look at all the happy people, want to know their secret? It’s statistics.
    • Statistics: Measures of Dispersion
      • Range
        • The simplest measure of dispersion, reflecting the spread of scores from the lowest to the highest
      • Variance
        • A useful measure of dispersion reflecting the average of the squared differences between each score and the mean of the group
      • Standard Deviation
        • A measure of dispersion that is calculated as the square root of the variance
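A minimal sketch of the three measures of dispersion defined above, computed on the same kind of made-up score set; the population formulas (pvariance, pstdev), which divide by N rather than N - 1, are used here.

```python
# Range, variance, and standard deviation of an invented set of scores.
import statistics

scores = [70, 75, 75, 80, 85, 90, 95]

score_range = max(scores) - min(scores)   # spread from lowest to highest
variance = statistics.pvariance(scores)   # mean squared deviation from the mean
std_dev = statistics.pstdev(scores)       # square root of the variance

print("Range:             ", score_range)
print("Variance:          ", round(variance, 2))
print("Standard deviation:", round(std_dev, 2))
```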
    • Statistics: Shapes of Distributions
      • Normal Distribution
        • A mathematically based distribution depicted as a bell-shaped curve
    • Statistics: Correlation and Regression
      • Correlation Coefficient
        • A statistic that measures the strength and direction of the relationship between two variables
      • Coefficient of Determination
        • The percentage of variance in a criterion that is accounted for by a predictor
      • The example to the right is a “Coefficient of Determination” graph from the Maryland Water Science Center
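A minimal sketch of a correlation coefficient and its squared value, the coefficient of determination, for two invented variables (hours of training and performance ratings).

```python
# Pearson correlation (r) and coefficient of determination (r squared)
# for two invented variables.
import numpy as np

hours_training = np.array([2, 4, 5, 7, 8, 10, 12])
performance = np.array([2.5, 3.0, 3.2, 3.8, 3.9, 4.4, 4.8])

r = np.corrcoef(hours_training, performance)[0, 1]   # strength and direction
r_squared = r ** 2                                    # proportion of criterion variance explained

print(f"Correlation coefficient (r): {r:.2f}")
print(f"Coefficient of determination (r^2): {r_squared:.2f}")
```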
    • Statistics: Meta-Analysis
      • Meta-Analysis
        • A methodology that is used to conduct quantitative literature reviews
      • A way to combine findings from different studies and provide the best estimate of the true relationship between two variables (e.g., job satisfaction and job performance).
      • Combines information from 25, 100, or even 1,000 studies to arrive at the best estimate of the true relationship (see the sketch below).
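A minimal sketch of one simple way a meta-analysis pools findings: a sample-size-weighted average of the correlations reported by individual studies (more elaborate methods also correct for artifacts such as measurement unreliability). The study correlations and sample sizes below are invented.

```python
# Pool correlations from several studies using a sample-size-weighted mean.
import numpy as np

study_r = np.array([0.30, 0.22, 0.41, 0.18, 0.35])   # correlation from each study
study_n = np.array([120, 85, 200, 60, 150])          # sample size of each study

pooled_r = np.average(study_r, weights=study_n)      # larger studies count more
print(f"Sample-size-weighted mean correlation: {pooled_r:.2f}")
```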
    • Fantabulous Internet Links
      • http://www.socialpsychology.org/io.htm
        • Dozens of great internet sites for I/O Psychology
      • http://www.siop.org/tip/TIP.aspx
        • Society for I/O Psychology Website
      • http://www.siop.org/GTP/
        • Graduate Training Programs in I/O Psychology
      • http://www.urbandictionary.com/
        • Use 2-3 times daily for 7 days, and you will laugh more!