    Measurement: Presentation Transcript

    • Measurement
      • Andrew Martin -- PS 372
    • What is measurement?
      • Measurement is the process by which phenomena are observed systematically and represented by scores and numerals.
    • Operationalization
      • Political scientists have to figure out how to measure the presence or absence of concepts in the real world.
      • The process requires political scientists to provide an operational definition of their concepts. Sometimes this is called operationalization.
    • © Judith A. Perrolle 1987
    • Operationalization Examples
      • Concept: Fundraising
      • Operational definition: Itemized receipts of daily campaign contributions made to each presidential candidate (FEC Database) (Haynes, Crespin, and Zorn 2004)
    • Operationalization Examples
      • Concept: legislator ideology
      • Operational definition: Legislator ratings from Americans for Democratic Action, which is set on a 100-point scale, 0 being “most conservative” and 100 “most liberal.”
    • What about corruption?
      • How would one operationalize corruption?
    • Political Corruption
      • Most define political corruption as the use of legislated powers by government officials for illegitimate private gain.
      • However, if one were going to operationalize political corruption, how would it be measured?
    • Transparency Int'l
      • Transparency International is a non-governmental organization that tracks corruption in the public sector around the globe.
      • The Corruption Perceptions Index (CPI) focuses on corruption in the public sector.
    • Corruption Perceptions Index
      • The surveys used in compiling the CPI ask questions relating to the abuse of public power for private benefit.
      • These include questions on:
        • bribery of public officials
        • kickbacks in public procurement
        • embezzlement of public funds
        • strength and effectiveness of public sector anti-corruption efforts
    • Corruption Perceptions Index
      • Measures the perceived levels of public sector corruption in 180 countries and territories.
      • A composite index based on 13 different expert and business surveys.
      • Is not intended to measure a country's progress over time.
    • U.S. Corruption?
      • The U.S. DOJ's Public Integrity Section tracks data on the number of federal, state and local government officials prosecuted and convicted for corruption crimes.
      • The data provide the number of people prosecuted by each U.S. Attorney's office.
      • Corruption isn't clearly defined, but cases include election fraud, obstruction of an investigation and violations of campaign finance regulations.
    • Three-step process
      • 1. Abstract concept
      • 2. Conceptual definition
      • 3. Operational definition
    • Measurements
      • The quality of measurements is judged with regard to both accuracy and precision.
      • Accuracy refers to how close the measure comes to capturing the true value of a concept.
      • Precision refers to the consistency of the measure in quantifying the concept.
    • Measurements
      • Reliability is the extent to which an experiment, test or any measurement procedure yields the same results in repeated trials.
    • Testing Measurements
      • Test-retest method -- Apply the same test to the same observations after a period of time has passed.
    • Testing Measures
      • Alternative-form method -- Use two different measures of the same concept rather than the same measure.
      • Ex: Using two different kinds of ideology measures for legislators: NOMINATE scores vs. interest group scores.
    • Testing Measurements
      • Split-halves method -- Use two different measures of the same concept, with both measures applied at the same time.
      • Ex: NOMINATE and interest group scores both in the same statistical model.
    • Measurements
      • Validity -- the degree of correspondence between the measure and the concept it is thought to measure.
      • Examples of validity issues -- Native Americans on U.S. Census, racial politics research, election turnout and voting
    • Measurements
      • Face validity is asserted by arguing that a measure corresponds closely to the concept it is designed to measure.
      • (Ex: Party ID and Ideology)
    • Face Validity
      • To confirm the validity of ideology and political party identification measures, I could examine their relationship:
        Political Party    U.S. Senate    U.S. House
        Republicans             20             16
        Democrats               87             92

        2007 ADA score based on a 100-point scale, with 0 meaning “always votes conservative” and 100 meaning “always votes liberal.” The measure is typically based on 20 votes in the 2008 congressional session. Senators of interest: Obama: 75 (15 votes, 15/15); McCain: 10 (15 votes, 2/15)
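The face-validity check above amounts to averaging individual ADA scores within each party and confirming that Democrats score high (liberal) and Republicans low. A sketch with hypothetical senators, whose scores are chosen so the averages match the Senate column of the table:

```python
# Hypothetical 2007 ADA scores (0 = most conservative, 100 = most liberal).
# Senator names and individual values are illustrative only.
scores = {
    "Senator A": ("Republican", 15),
    "Senator B": ("Republican", 25),
    "Senator C": ("Democrat", 85),
    "Senator D": ("Democrat", 89),
}

# Group scores by party, then average within each group
by_party: dict = {}
for party, score in scores.values():
    by_party.setdefault(party, []).append(score)

means = {party: sum(v) / len(v) for party, v in by_party.items()}
print(means)  # Democrats average far higher, as face validity predicts
```

If the party means did not diverge in the expected direction, that would cast doubt on one (or both) of the measures.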
    • Measurements
      • Content validity is demonstrated by ensuring that the full domain of a concept is measured.
    • Content Validity
      • Example: Dahl’s polyarchy
      • Polyarchy, according to Dahl, is a form of representative democracy characterized by a particular set of political institutions. These include elected officials, free and fair elections, inclusive suffrage, the right to run for office, freedom of expression, alternative sources of information and associational autonomy.
    • Content Validity
      • Domain: the full set of institutions identified in Dahl's book
    • Measurements
      • Construct validity is demonstrated for a measure by showing that it is related to the measure of another concept.
      • Ex: Ideological identification and level of education.
    • Inter-item Association
      • Inter-item association relies on the similarity of outcomes of more than one measure of a concept to demonstrate the validity of the entire measurement scheme.
      • Ex: Candidate strength can be cross validated by comparing measures of campaign funds, polling numbers, primary votes and newspaper coverage.
    • Correlation
      • A correlation indicates the direction and strength of a linear relationship between two random variables.
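For two samples, Pearson's r follows directly from that definition: the covariance of the two variables divided by the product of their standard deviations. A minimal sketch:

```python
from math import sqrt

def pearson_r(x: list, y: list) -> float:
    """Pearson correlation: covariance of x and y divided by the
    product of their standard deviations (sample version)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# A perfectly linear, increasing relationship gives r of (approximately) 1
print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 6))
```

The sign of r gives the direction of the relationship (positive or negative) and its magnitude, between 0 and 1, gives the strength of the linear association.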